
News Title: “Groq CEO Forecasts Most AI Startups Will Adopt Faster LPU Technology by End of 2024”

Keywords: AI chips, LPU, startups

News Content: In a recent interview, Jonathan Ross, CEO of Silicon Valley-based startup Groq, said he expects the majority of artificial intelligence (AI) startups to adopt faster specialized chips, known as language processing units (LPUs), by the end of 2024. Groq is developing these AI chips to accelerate inference — the decision-making and prediction stage of large language models — rather than model training.

Demonstrating an audio chatbot powered by Groq’s LPU, Ross highlighted its record-breaking response speed, underscoring the LPU’s advantage in boosting AI performance. He pointed out that the high cost of AI inference currently poses a major challenge for many companies, and that Groq’s solution aims to offer “ultra-fast” chips at a lower cost to meet market demand.

Ross confidently stated that Groq’s technology is on track to become a part of many startups’ infrastructure by the end of the year, with a pricing strategy particularly friendly to cash-strapped startups. This development promises a significant boost in computational efficiency and cost-effectiveness in the AI industry, potentially fueling a new wave of innovation and providing AI startups with a stronger competitive edge.

[Source] https://wallstreetcn.com/articles/3709133

