BEIJING, Xinhua – The MiniMax open platform announced today the full-scale launch of its large language model abab6, the first large language model in China built on a MoE (Mixture-of-Experts) architecture. While retaining the strong task-handling capability that comes with its large parameter count, abab6 uses the MoE architecture to substantially increase the volume of training data it can exploit and to improve computational efficiency.
According to the MiniMax open platform, the release of abab6 marks an important step forward for China's research on, and application of, large language models in artificial intelligence. The MoE structure distributes knowledge across experts for different domains, which not only strengthens the model's understanding of complex problems but also makes training more efficient.
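The article does not describe abab6's internals, but for readers unfamiliar with MoE, the sketch below shows in broad strokes how a Mixture-of-Experts layer routes each token to a small subset of experts through a learned gating network. The expert count, top-k value, and dimensions here are illustrative assumptions only and do not reflect MiniMax's actual abab6 implementation.

```python
# Minimal, hypothetical top-k Mixture-of-Experts (MoE) layer in PyTorch.
# NOT MiniMax's abab6 implementation; all sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One small feed-forward network per expert.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The gating network scores every expert for every token.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing.
        tokens = x.reshape(-1, x.size(-1))
        gate_logits = self.gate(tokens)                       # (num_tokens, num_experts)
        weights, expert_idx = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                  # normalize over the chosen experts

        out = torch.zeros_like(tokens)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = expert_idx[:, slot] == e               # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot : slot + 1] * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = MoELayer(d_model=64, d_hidden=256)
    y = layer(torch.randn(2, 10, 64))
    print(y.shape)  # torch.Size([2, 10, 64])
```

Because only the top-k experts run for any given token, compute per token stays roughly constant even as the total parameter count grows, which is the efficiency gain the article attributes to the MoE design.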
Against the backdrop of rapid progress in artificial intelligence, the computational efficiency and processing capability of large language models have become key measures of technical advancement. The launch of abab6 will give researchers and enterprises in China strong language-processing capabilities, driving innovation in fields such as intelligent speech, natural language understanding, and machine translation.
MiniMax said it will continue to explore more efficient and more capable language models to support a broader range of application scenarios and to advance China's artificial intelligence technology.
Source: https://mp.weixin.qq.com/s/2aFhRUu_cg4QFdqgX1A7Jg