NVIDIA GTC: Jensen Huang in Dialogue with the Seven Authors of the Transformer Paper

At this year's GTC conference, NVIDIA CEO Jensen Huang hosted a highly anticipated roundtable with seven of the original authors of the Transformer paper to discuss the future of artificial intelligence. It was the first time these authors had appeared together in public, and their discussion was substantive and memorable.

Huang brought the seven authors on stage with a strong sense of ceremony, a gesture that underscored the Transformer's outsized impact on the field of artificial intelligence. In the conversation, the authors agreed that the world needs something better than the Transformer, something that can carry us to a new plateau of performance. Their original goal, they recalled, was to model how tokens evolve: not just a linear, left-to-right generation process, but the step-by-step evolution of a piece of text or code.

They also pointed out, however, that a trivial question such as 2 + 2 can end up engaging the trillions of parameters of a large model. Adaptive computation, they argued, is therefore one of the next things that has to emerge: the system should know how much compute to spend on a given problem.
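
The adaptive-computation idea can be made concrete with a small sketch. Nothing below comes from the panel itself: the model tiers, the `estimate_difficulty` heuristic, and the routing threshold are hypothetical stand-ins, meant only to show what "spend less compute on easy questions" could look like in practice.

```python
# Illustrative sketch of adaptive computation: spend little compute on easy
# queries and reserve the large model for hard ones. Model names, the
# difficulty heuristic, and the threshold are all hypothetical.

def estimate_difficulty(prompt: str) -> float:
    """Crude stand-in for a learned difficulty estimator (0 = trivial, 1 = hard)."""
    text = prompt.lower()
    if any(marker in text for marker in ("2+2", "what day is it", "hello")):
        return 0.05                      # obviously trivial queries
    if any(word in text for word in ("prove", "derive", "design", "debug")):
        return 0.9                       # open-ended reasoning tasks
    return 0.5                           # everything else: medium difficulty

def route(prompt: str, threshold: float = 0.3) -> str:
    """Pick a model tier based on estimated difficulty."""
    if estimate_difficulty(prompt) < threshold:
        return "small-model"             # cheap distilled model
    return "large-model"                 # trillion-parameter model

if __name__ == "__main__":
    for query in ("2+2", "Prove that the halting problem is undecidable."):
        print(f"{query!r} -> {route(query)}")
```

A production system would replace the keyword heuristic with a learned difficulty or confidence estimate, but the routing structure stays the same.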

At the same time, they argued that current models are still too cheap and too small. At roughly one dollar per million tokens, they are about a hundred times cheaper than buying a paperback book. That may hint that future models will cost more, but with considerably stronger capability in return.
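
The "a hundred times cheaper than a paperback" comparison is easy to check with back-of-the-envelope arithmetic. Only the one-dollar-per-million-tokens price comes from the discussion above; the book price, word count, and tokens-per-word ratio below are my own assumptions, not figures quoted at the panel.

```python
# Back-of-the-envelope check of the "100x cheaper than a paperback" claim.
# Book price, word count, and tokens-per-word are rough assumptions.

price_per_million_tokens = 1.00   # USD, the figure quoted at the panel
paperback_price = 15.00           # USD, assumed typical retail price
paperback_words = 100_000         # assumed typical length of a paperback
tokens_per_word = 1.3             # common rule-of-thumb conversion

paperback_tokens = paperback_words * tokens_per_word
generation_cost = paperback_tokens / 1_000_000 * price_per_million_tokens

print(f"Tokens in one paperback: ~{paperback_tokens:,.0f}")
print(f"Cost to generate that many tokens: ~${generation_cost:.2f}")
print(f"The printed book costs ~{paperback_price / generation_cost:.0f}x more")
```

Under these assumptions the gap comes out at roughly two orders of magnitude, which is the comparison the authors were drawing.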

NVIDIA's GTC conference drew many industry leaders and experts to discuss the latest advances in artificial intelligence, deep learning, virtual reality, and related fields. As a major event in the AI community, GTC plays a significant role in driving the development of AI technology in China.

This roundtable not only showcased the Transformer's far-reaching influence, it also pointed toward where artificial intelligence is heading next. The exchange between Jensen Huang and the Transformer's original authors gives us plenty to look forward to.

Source: https://new.qq.com/rain/a/20240321A00W5H00
