

**News Title:** “Groq CEO Forecasts Most AI Startups to Adopt Faster LPU Technology by 2024”

**Keywords:** Groq, LPU, AI startups

**News Content:**

### Groq CEO Predicts Most AI Startups to Embrace Faster LPUs by 2024

Jonathan Ross, founder and CEO of Silicon Valley innovator Groq, recently stated that the majority of artificial intelligence (AI) startups are expected to shift to faster specialized chips, known as Large Language Model Inference Processors (LPUs), by 2024. Groq is currently developing this novel LPU technology, which is designed to accelerate the decision-making and predictive capabilities of large language models during inference, rather than for model training.

In an interview, Ross showcased a voice chatbot powered by Groq’s technology, which garnered attention for its record-breaking response speed. He highlighted that the current costs associated with AI inference processes are substantial, and Groq’s LPU design aims to offer a “superfast” and more cost-effective solution to meet market demands.

Ross confidently asserted, “By the end of this year, we envision Groq’s LPU becoming a foundational infrastructure choice for many startups, particularly given our pricing strategy that is extremely startup-friendly.” This forecast suggests that Groq aims to break down the performance and cost barriers of AI technology, providing a more efficient engine for the industry’s rapid growth.

This development is a boon for AI startups, enabling them to improve service quality while reducing operational costs. Groq’s innovation could reshape the AI market landscape, driving more companies to adopt advanced and cost-effective hardware solutions to maintain a competitive edge in the fiercely contested AI sector.

[Source] https://wallstreetcn.com/articles/3709133
