**News Title:** “Groq CEO Forecasts Most AI Startups to Adopt Faster LPU Chips by 2024”
**Keywords:** Groq, LPU, AI Chips
**News Content:** In a recent interview, Jonathan Ross, CEO of Silicon Valley innovator Groq, said he expects the majority of artificial intelligence (AI) startups to adopt faster, specialized chips known as Language Processing Units (LPUs) by 2024. Groq is dedicated to developing this new class of LPU to optimize inference for large language models, i.e., their decision-making and predictive processes, rather than only the training phase.
Demonstrating an audio chatbot powered by Groq's chip, Ross proudly announced that the chatbot's response time set a new record, marking a significant leap in AI inference efficiency. The high cost of AI inference currently poses a major challenge to the industry, and Groq's solution aims to offer "ultra-fast" chips at a lower cost, catering to startups' demand for high-performance, cost-effective computing.
Confident in Groq’s technology, Ross predicts, “By the end of this year, Groq’s LPU is likely to become a part of many startups’ infrastructure. Our pricing strategy is very attractive to startups.” This statement suggests that Groq’s LPU technology could spearhead the next transformation in the AI industry, opening new possibilities for startups seeking performance enhancements and cost optimization.
**Source:** https://wallstreetcn.com/articles/3709133