

**MiniMax Releases Trillion-Parameter MoE Models: the abab 6.5 Series Raises the Bar for Language Processing**

Today, AI company MiniMax (稀宇科技) announced on its official website the launch of the new abab 6.5 series of large language models, marking another major step for the company in large-scale pre-trained models. The series comprises two versions, abab 6.5 and abab 6.5s, both of which demonstrate strong processing capability and efficiency.

According to the announcement, the abab 6.5 model has on the order of a trillion parameters and can handle context windows of up to 200,000 tokens, placing it among the leaders of comparable models. It performs well on core-capability benchmarks, approaching widely recognized top models such as GPT-4, Claude-3, and Gemini-1.5, and showing strong language understanding and generation ability.

abab 6.5s, the efficient variant of the series, is trained with the same techniques and data as abab 6.5 but with significantly higher computational efficiency. At the same context length as abab 6.5, abab 6.5s can process nearly 30,000 Chinese characters of text in about one second, making real-time processing of large volumes of text feasible.
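The throughput figure above can be sanity-checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming roughly one token per Chinese character (a hypothetical ratio; the announcement does not state the tokenizer's actual characters-per-token rate):

```python
# Back-of-the-envelope check of the abab 6.5s throughput claim.
# ASSUMPTION: ~1 token per Chinese character; real tokenizer ratios vary.
CHARS_PER_SECOND = 30_000   # claimed processing rate for abab 6.5s
CONTEXT_TOKENS = 200_000    # claimed maximum context length
TOKENS_PER_CHAR = 1.0       # hypothetical tokenizer ratio

tokens_per_second = CHARS_PER_SECOND * TOKENS_PER_CHAR
seconds_for_full_context = CONTEXT_TOKENS / tokens_per_second

print(f"~{tokens_per_second:,.0f} tokens/s")
print(f"~{seconds_for_full_context:.1f} s to ingest a full 200k-token context")
```

Under that assumption, filling the entire 200,000-token context would take only a few seconds, which is consistent with the article's framing of abab 6.5s as the real-time variant.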

This move by MiniMax is likely to push the boundaries of AI in natural language processing, with potential impact on applications across research, education, media, and everyday life. The abab 6.5 series reflects MiniMax's continued focus on improving AI efficiency and performance, and its ambition to compete at the front of the global AI race.

Looking ahead, MiniMax says it will continue to invest in research and development, aiming to empower users worldwide through more advanced models and technologies and to advance the accessibility and innovation of AI.


[Source] https://mp.weixin.qq.com/s/xBoAP-6fZVQA9cEWT8gyfw
