**Mistral AI Releases New Open-Source Large Model Mixtral MoE 8x7B, Drawing Industry Attention**
Recently, the AI startup Mistral AI released a new open-source large model, Mixtral MoE 8x7B. The high-profile company, which recently raised nearly $500 million, announced the model on X with nothing but a link to a large torrent file for downloading it, with no accompanying publicity. The move has drawn wide attention across the industry.
A Reddit user analyzed the release and described the new model as a “scaled-down GPT-4”, apparently a mixture-of-experts model composed of eight 7B expert models. The news quickly sparked heated discussion, with many experts and industry insiders expressing strong interest in the model’s performance and application prospects.
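If the Reddit analysis is accurate, “8x7B” refers to a sparse mixture-of-experts (MoE) architecture, in which a learned router activates only a few expert feed-forward networks per token rather than the full parameter count. The PyTorch sketch below is purely illustrative of that general idea; the dimensions, the eight experts, and the top-2 routing are assumptions chosen for demonstration, not details confirmed by Mistral AI.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy sparse mixture-of-experts layer: a router scores every expert
    for each token, and only the top-k experts are run and mixed per token.
    Illustrative only -- not Mistral AI's actual implementation."""

    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One independent feed-forward "expert" network per slot.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        # The router produces one logit per expert for each token.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):                        # x: (batch, seq, d_model)
        logits = self.router(x)                  # (batch, seq, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # renormalize over chosen experts
        out = torch.zeros_like(x)
        # Looping over experts is written for clarity; production systems
        # dispatch tokens to experts in batches instead.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e      # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(2, 16, 512)                 # toy batch of token embeddings
print(layer(tokens).shape)                       # torch.Size([2, 16, 512])
```

The appeal of such a design is that total parameter count scales with the number of experts while per-token compute scales only with top_k, which is one reason an “8x7B” mixture could plausibly approach the capability of much larger dense models.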
As a closely watched AI startup, Mistral AI has undoubtedly brought new challenges and opportunities to the entire industry with this release. On one hand, the arrival of Mixtral MoE 8x7B may put competitive pressure on other large language models; on the other, the model’s open-source nature helps advance the field as a whole and fosters technical exchange and collaboration.
Source: https://venturebeat.com/ai/mistral-ai-bucks-release-trend-by-dropping-torrent-link-to-new-open-source-llm/