Stability AI Unveils Efficient Language Model Stable LM 2 1.6B

Keywords: Stability AI, Language Model, Stable LM 2 1.6B

AI company Stability AI has recently released Stable LM 2 1.6B, a compact and highly efficient language model with 1.6 billion parameters. The model is trained on multilingual data covering English, Spanish, German, Italian, French, Portuguese, and Dutch.

According to Stability AI, Stable LM 2 1.6B performs strongly on most benchmarks, outperforming other small language models with fewer than 2 billion parameters, such as TinyLlama 1.1B and Falcon 1B, and on some tests even Microsoft's larger Phi-2 (2.7B).

The release of Stable LM 2 1.6B marks a new stage in the development of AI technology, showing that small language models can approach the performance of much larger models while remaining efficient. This is a significant benefit for users who need to run language models in resource-constrained environments.

Source: https://stability.ai/news/introducing-stable-lm-2

