News Title: “Groq CEO Forecasts: Most AI Startups to Embrace High-Speed LPU Tech by End of 2024”

Keywords: AI chips, LPU acceleration, startup favorite

News Content: [Silicon Valley Innovation] In a recent interview, Groq CEO Jonathan Ross said he expects the majority of AI startups to adopt faster LPU (Language Processing Unit) chips by the end of 2024. Groq, a Silicon Valley chip company, is developing a new generation of AI chips designed specifically for large language model inference, targeting the speed and cost problems of the AI inference process.

Ross demonstrated an audio chatbot powered by Groq’s technology whose record-setting response speed showcased the performance of the LPU chips. He noted that the cost of AI inference is a challenge for many startups, and that Groq’s answer is to offer “ultra-fast” yet affordable LPUs to meet their needs.

“We believe that by the end of this year, Groq will become the infrastructure partner of choice for many startups,” Ross said. “Our pricing is very accessible, designed to help these innovative companies run AI applications more efficiently at lower cost.” The announcement points to a coming shift in AI chip technology, laying stronger foundations for future intelligent services. Source: Wallstreetcn (华尔街见闻).

[Source] https://wallstreetcn.com/articles/3709133
