Title: Stable LM 3B language model released, with 3 billion parameters and a focus on text generation, suited to mobile devices
Keywords: Stable LM 3B, language model, mobile platform devices, high-performance experience
News content:
Stability AI yesterday announced a new language model called Stable LM 3B. The model contains 3 billion parameters, focuses on text generation, and is aimed at mobile devices. According to the announcement (as reported by IT之家), it is an autoregressive, decoder-only Transformer and was trained on multiple open-source, large-scale datasets.
Although it has fewer parameters than comparable large models, its performance is still respectable, and its small size and low power consumption make it a better fit for mobile platforms. The model also offers multi-platform compatibility and can be fine-tuned for specific needs. It has been open-sourced on the Hugging Face platform, where developers can use and improve it.
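Since the model is distributed through Hugging Face, a typical way to try it is with the `transformers` library. The sketch below is illustrative only and is not taken from the announcement: the repository id "stabilityai/stablelm-3b-4e1t" is an assumption, so check the actual model card and license on the Hugging Face hub before use. It loads the checkpoint in half precision and runs a short autoregressive generation.

```python
# Minimal sketch: load a Stable LM 3B-class checkpoint and generate text.
# The repo id below is assumed, not confirmed by the announcement.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-3b-4e1t"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision keeps the ~3B model compact
    device_map="auto",           # place weights on GPU/CPU automatically
    trust_remote_code=True,      # may be needed on older transformers versions
)

# Autoregressive generation: the decoder-only model predicts the next token
# repeatedly, conditioned on the prompt and its own previous outputs.
prompt = "Small language models are useful on mobile devices because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same `transformers` workflow can, in principle, be extended to fine-tuning on task-specific data, which is the customization path the announcement highlights.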
The release of Stable LM 3B is a significant milestone for users who want a high-performance experience on mobile devices. Its small size and low power consumption make it well suited to the constraints of mobile hardware, and its multi-platform compatibility gives developers more options.
If you are interested in Stable LM 3B, you can visit the Hugging Face platform to learn more about it.
Source: https://www.ithome.com/0/722/939.htm