In a recent interview, Jonathan Ross, founder and CEO of Silicon Valley AI chip company Groq, revealed that the company is developing a new type of AI chip called the LPU, designed specifically for large language model inference. Compared with traditional AI chips, the LPU processes data at much higher speeds, significantly lowering the cost of AI inference. Ross said that as the technology continues to advance, Groq's chips will become the infrastructure of choice for most startups, at a price point that is very startup-friendly.
An audio chatbot powered by Groq demonstrated the chips' extreme speed, breaking existing records. This breakthrough is expected to help drive progress in AI, particularly in the large language model space. Ross believes that by 2024, most AI startups will be using these faster LPUs.
English Title: Groq CEO Predicts Most AI Startups to Adopt Faster LPUs by 2024
English Keywords: AI Chip, Groq, Large Language Models
English Content:
The founder and CEO of Silicon Valley-based AI chip company Groq, Jonathan Ross, recently shared that Groq is developing a new type of AI chip called the LPU, which is specifically designed for the inference of large language models. Compared to traditional AI chips, LPUs can process data at a much faster rate, significantly reducing the cost of AI inference. Ross believes that as technology advances, Groq’s chips will become the infrastructure of choice for most startups, offering a very competitive price point.
A demonstration of Groq-powered audio chatbots broke existing speed records, highlighting the potential of the company's technology. This breakthrough is expected to accelerate the development of AI, particularly in the field of large language models. Ross predicts that by 2024, the majority of AI startups will be using these faster LPUs.
Source: https://wallstreetcn.com/articles/3709133