Technological Growth Feasible but Uncertain, Says Foresight News
September 2, 2024
Artificial Intelligence (AI) has been advancing at breakneck speed, and a new report suggests the trajectory is set to continue. According to Foresight News, by 2030 AI models could be scaled up to a staggering 10,000 times their current size. The report, compiled by MetaverseHub and authored by Jason Dorrier, highlights the exponential growth in AI capabilities over the past few years, attributed primarily to the scaling up of algorithms and models.
The Power of Scale
The report notes that recent progress in AI can largely be credited to a simple concept: scale. Since the beginning of the 21st century, AI laboratories have observed that continually enlarging their algorithms and feeding them more data significantly enhances performance. The latest generation of AI models boasts billions to over a trillion internal network connections, and these models learn to write and code like humans by consuming vast amounts of internet data.
Computation Power Surges
Training larger algorithms requires more computing power. According to data from the non-profit AI research institute EpochAI, the computing power dedicated to AI training has been growing roughly fourfold each year. If this growth continues, by 2030 AI models could be trained with about 10,000 times the computing power used for today's most advanced algorithms, such as OpenAI's GPT-4.
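A back-of-the-envelope check shows how this compounding reaches roughly 10,000 times. This is only a sketch: the ~4x annual growth rate is the figure under discussion, while the exact time window is an illustrative assumption.

```python
# Rough sanity check of the scaling claim. The ~4x annual growth rate is the
# rate under discussion; the time window is an illustrative assumption.
GROWTH_PER_YEAR = 4.0      # training compute multiplier per year
YEARS = 2030 - 2023.5      # ~6.5 years from GPT-4-era models to 2030

multiplier = GROWTH_PER_YEAR ** YEARS
print(f"Projected compute multiple over GPT-4 by 2030: ~{multiplier:,.0f}x")
```

Compounding fourfold per year for about six and a half years lands near the 10,000x figure; a mere doubling per year over the same window would yield only about 90x, which is why the growth rate matters so much.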
Challenges Ahead
However, the report cautions that while maintaining this growth is technically feasible, it is not guaranteed. Epoch outlines four key constraints to AI scaling: power, chips, data, and latency.
1. Power Consumption
Training a cutting-edge AI model in 2030 could require 200 times the electricity used today, roughly the annual power consumption of 23,000 American households. This surge in demand poses a significant challenge, as very few power plants can supply that much electricity, and most are likely committed under long-term contracts. However, Epoch suggests that companies will seek out locations where they can draw power from multiple plants through the local grid. While challenging, this approach remains possible, especially given planned utility growth.
2. Chip Supply
All that electricity goes to running AI chips, which are essential for training new models. The report focuses on the production of these chips, particularly the graphics processing units (GPUs) used by AI laboratories, a market Nvidia leads. These chips are fabricated by Taiwan Semiconductor Manufacturing Company (TSMC) and paired with high-bandwidth memory. The report predicts that between 20 million and 400 million AI chips could be available for training by 2030, enough to train models with about 50,000 times the computing power of GPT-4.
3. Data Availability
AI’s insatiable appetite for data is well known, and the report flags the coming scarcity of high-quality training data as a significant constraint. While some predict that the supply of high-quality public data streams will dry up by 2026, Epoch believes data scarcity will not hinder model development before 2030. The report suggests that incorporating non-text data, such as images, audio, and video, can supplement the supply of training data and enhance model capabilities.
4. Latency
The report also considers the issue of latency, or the delay in processing data. As models grow larger, the time it takes to train and deploy them increases. However, with advancements in technology, such as the use of high-bandwidth fiber connections and distributed data centers, the report suggests that this challenge can be mitigated.
Conclusion
While the EpochAI report, as covered by Foresight News, paints an optimistic picture of AI’s future, it also underscores the significant challenges that must be overcome. The growth in AI capabilities is technically feasible, but by no means certain. The report serves as a call to action for the industry to address these constraints and keep pushing the boundaries of what AI can achieve.