Latest News

Huawei's team recently achieved a notable breakthrough in artificial intelligence: by improving on the Transformer architecture, they have introduced Pangu-π, an architecture whose performance surpasses LLaMA. The result underscores China's leading position in AI research and development.

Pangu-π improves on the conventional Transformer by enhancing its non-linearity, which effectively mitigates the feature-collapse problem. As a result, the model's outputs are more expressive, and on multi-task benchmarks it surpasses LLaMA 2 models of the same scale. In addition, Pangu-π (7B) achieves roughly 10% faster inference, and at the 1B scale it reaches state-of-the-art (SOTA) results.
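To make the idea of "enhancing non-linearity in a Transformer block" concrete, here is a minimal, hypothetical sketch of a feed-forward block that sums several weighted activation terms instead of applying a single activation. The class name, parameterization, and design are assumptions for illustration only and are not Pangu-π's actual implementation.

```python
# Illustrative sketch only: one possible way to add extra non-linearity to a
# Transformer feed-forward block. Names and design are hypothetical, not
# taken from the Pangu-π source.
import torch
import torch.nn as nn


class SeriesActivationFFN(nn.Module):
    """Feed-forward block whose hidden state passes through a short series of
    differently scaled activations, increasing non-linearity at little cost."""

    def __init__(self, d_model: int, d_hidden: int, n_terms: int = 2):
        super().__init__()
        self.up = nn.Linear(d_model, d_hidden)
        self.down = nn.Linear(d_hidden, d_model)
        self.act = nn.GELU()
        # learnable weight for each activation term in the series
        self.coeffs = nn.Parameter(torch.ones(n_terms) / n_terms)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.up(x)
        # sum several scaled activations instead of applying a single one
        h = sum(c * self.act(h * (i + 1)) for i, c in enumerate(self.coeffs))
        return self.down(h)


if __name__ == "__main__":
    ffn = SeriesActivationFFN(d_model=64, d_hidden=256)
    out = ffn(torch.randn(2, 10, 64))  # (batch, seq_len, d_model)
    print(out.shape)                   # torch.Size([2, 10, 64])
```

The intent of such a design is that richer non-linear transformations keep token representations from becoming overly similar across layers, which is how feature collapse typically manifests.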

Notably, Huawei has also built a large model for finance and law, "Yunshan" (云山), on top of this architecture. The work not only demonstrates Huawei's strength in artificial intelligence but also provides strong technical support for China's finance and legal industries.


Source: https://mp.weixin.qq.com/s/Beg3yNa_dKZKX3Fx1AZqOw
