

News Title: "Groq CEO Forecasts Most AI Startups to Adopt Faster LPU Chips by 2024"

Keywords: AI chips, LPU, startups

News Content: In a recent media interview, Jonathan Ross, CEO of Silicon Valley innovator Groq, predicted that the majority of artificial intelligence (AI) startups will adopt faster LPU (Language Processing Unit) chips by 2024. Groq is developing this novel AI chip, designed specifically for large language model inference, to address the efficiency and cost problems of current AI inference.

Ross showcased an audio chatbot powered by Groq's technology, which drew attention for its record-breaking response speed. He highlighted the high cost of AI inference as a major challenge for many startups; Groq's LPU chips, he said, offer a "superfast" and more economical alternative, providing efficient support for running large models.

“We are dedicated to offering more accessible infrastructure to startups,” Ross said. “By the end of this year, Groq is likely to become the preferred chip supplier for these companies. Our products boast exceptional performance and are competitively priced.” Ross’s statement suggests that Groq’s LPU chips could potentially reshape the computing landscape in the AI industry, bringing technological innovation and cost advantages to numerous AI startups.

Source: https://wallstreetcn.com/articles/3709133
