News Title: "Groq CEO Forecasts Most AI Startups Will Adopt Faster LPU Chips by End of 2024"

Keywords: AI chips, LPU, startups

News Content: In a recent media interview, Jonathan Ross, CEO of Silicon Valley innovator Groq, predicted that the majority of artificial intelligence (AI) startups will adopt faster LPU (Language Processing Unit) chips by the end of 2024. Groq is developing this new class of AI chip, designed specifically for large language model inference, to address the efficiency and cost problems of current AI inference.

Ross showcased an audio chatbot powered by Groq's technology, which drew attention for its record-breaking response speed. He highlighted the high cost of AI inference as a major challenge for many startups, and positioned Groq's LPU chips as a "superfast" and more economical alternative that can efficiently run large models.

"We are dedicated to offering more accessible infrastructure to startups," Ross said. "By the end of this year, Groq is likely to become the preferred chip supplier for these companies. Our products boast exceptional performance and are competitively priced." Ross's remarks suggest that Groq's LPU chips could reshape the computing landscape of the AI industry, bringing technological innovation and cost advantages to many AI startups.

[Source] https://wallstreetcn.com/articles/3709133