**Headline:** "01.AI Releases Hundred-Billion-Parameter Closed-Source Model Yi-Large, Driving AI Model Innovation and Upgrades"
Keywords: 01.AI, AI model upgrade, overseas revenue growth.
**News Content:**
#### 01.AI Releases Hundred-Billion-Parameter Closed-Source Model Yi-Large and Upgrades Its Open-Source Models
Today, on the first anniversary of its founding, 01.AI, a unicorn in China's AI large-model field, launched a series of major product upgrades. On the closed-source side, the company introduced Yi-Large, a globally state-of-the-art (SOTA) closed-source large model with hundreds of billions of parameters, whose evaluation results surpass GPT-4 on some benchmarks, demonstrating strong competitiveness. Kai-Fu Lee, founder and CEO of 01.AI, revealed that the company is training Yi-XLarge MoE, a mixture-of-experts model with an even larger parameter count, which suggests that more advanced products may follow.
In the open-source field, 01.AI upgraded its previously released small and mid-sized open-source models Yi-34B, Yi-9B, and Yi-6B to the Yi-1.5 series, with each version achieving SOTA performance among models of the same size, further consolidating 01.AI's leading position in open-source models.
During the media session, Kai-Fu Lee shared the revenue outlook for 01.AI's consumer-facing overseas productivity applications this year: revenue is expected to reach 100 to 200 million RMB, drawn mainly from subscription payments by overseas users. This figure underscores 01.AI's competitiveness and commercial potential in international markets.
Whether it is the release of the closed-source model Yi-Large or the upgrade of its open-source lineup, 01.AI's moves highlight its technical strength and business positioning in the large-model field. Once training of the larger MoE model Yi-XLarge MoE is complete, there is good reason to expect more from 01.AI.
[Source] https://zhidx.com/p/424764.html