
**News Title:** **Alibaba Cloud Tongyi Qianwen Releases 32B-Parameter Model Qwen1.5-32B: Seven Large Models Fully Open Source, Balancing Performance, Efficiency, and Memory Usage**

Keywords: Tongyi Qianwen, Open source model, Qwen1.5-32B

**News Content:** Alibaba Cloud Tongyi Qianwen Open Sources the 32B-Parameter Model Qwen1.5-32B, Striking a Balance Between Performance, Efficiency, and Memory Usage

Recently, Alibaba Cloud, a well-known Chinese artificial intelligence company, announced that its Tongyi Qianwen team has open-sourced Qwen1.5-32B, a large language model with 32 billion parameters. The model strikes an optimal balance between performance, efficiency, and memory usage, offering enterprises and developers a more cost-effective choice of AI model.

The Tongyi Qianwen team has previously open-sourced six large language models, with 0.5 billion, 1.8 billion, 4 billion, 7 billion, 14 billion, and 72 billion parameters, all of which have been upgraded to version 1.5. The newly released Qwen1.5-32B builds on this foundation with further research and optimization. According to available figures, the Tongyi Qianwen series has surpassed 3 million cumulative downloads across open-source communities in China and abroad, attracting widespread attention and positive reviews.

The successful open-sourcing of the Tongyi Qianwen models not only demonstrates China's technical strength in artificial intelligence but also provides a powerful tool for developers worldwide. In terms of performance, Qwen1.5-32B handles a range of natural language processing tasks effectively, such as text classification, machine translation, and question answering. In terms of efficiency and memory usage, the model has been specifically optimized to run efficiently across different hardware platforms, meeting the diverse needs of enterprises and developers.

The open-sourced Qwen1.5-32B model will open up new possibilities for natural language processing research and applications. Going forward, the Tongyi Qianwen team will continue to focus on model research and optimization, contributing to the advancement of artificial intelligence technology.

Source: Lei Feng Net

[Source] https://www.leiphone.com/category/industrynews/ZF6wxlum2yVXJXzo.html
