Beijing – The open-source landscape for AI model optimization is becoming increasingly crowded, with competition heating up between leading Chinese AI companies. Moonshot AI, the company behind the popular Kimi chatbot, has released an upgraded version of its Muon optimizer, claiming roughly twice the computational efficiency of the widely used AdamW optimizer, reaching comparable results with about half the training compute.
This move comes shortly after DeepSeek, another prominent AI firm, announced plans to open-source five code repositories this week. The timing suggests a potential rivalry between the two companies, reminiscent of their previous near-simultaneous releases of improved attention mechanisms, as reported by Machine Heart.
The original Muon optimizer was known for its effectiveness in training smaller language models. However, its scalability to larger models remained a question. To address this, Moonshot AI’s team focused on two key improvements:
- Adding weight decay: a decoupled weight-decay term, in the style of AdamW, that keeps parameter magnitudes bounded during long training runs, which proved crucial for scaling the optimizer to larger, more complex models.
- Consistent RMS updates: rescaling each weight matrix's update so the root mean square (RMS) of updates stays consistent across parameters of different shapes, keeping the effective step size in line with what AdamW would produce.
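The two changes above can be sketched in a few lines of NumPy. This is a minimal illustration, not Moonshot AI's released code: the Newton–Schulz coefficients follow the original Muon reference implementation, while the `0.2 * sqrt(max(n, m))` RMS-matching factor and the function names (`newton_schulz`, `muon_step`) are assumptions made for this sketch.

```python
import numpy as np

def newton_schulz(G, steps=5, eps=1e-7):
    # Approximately orthogonalize the momentum matrix G, as in the
    # original Muon optimizer. Coefficients are those used in Muon's
    # public reference implementation (an assumption here).
    a, b, c = 3.4445, -4.7750, 2.0315
    X = G / (np.linalg.norm(G) + eps)  # normalize so singular values <= 1
    transposed = X.shape[0] > X.shape[1]
    if transposed:
        X = X.T
    for _ in range(steps):
        A = X @ X.T
        X = a * X + (b * A + c * A @ A) @ X
    return X.T if transposed else X

def muon_step(W, G, M, lr=0.02, momentum=0.95, weight_decay=0.1):
    # One update with the two additions described above:
    # decoupled weight decay and an RMS-matching scale factor.
    M = momentum * M + G                  # momentum accumulation
    O = newton_schulz(M)                  # orthogonalized update direction
    n, m = W.shape
    rms_scale = 0.2 * np.sqrt(max(n, m))  # keeps update RMS consistent
                                          # across matrix shapes (assumed form)
    W = W - lr * (rms_scale * O + weight_decay * W)
    return W, M
```

Because the RMS of the update no longer depends on the matrix shape, hyperparameters tuned for AdamW can be reused largely unchanged, which is what makes the optimizer usable "directly" at scale.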
These enhancements allow Muon to be used directly in large-scale training without extensive hyperparameter tuning. According to Moonshot AI's scaling-law experiments, the improved Muon achieves roughly a two-fold increase in computational efficiency over compute-optimal AdamW training.
The release of Moonshot AI’s optimized Muon is a significant development for the AI community. By open-sourcing this technology, Moonshot AI is contributing to the advancement of more efficient and accessible AI model training. This could potentially lower the barrier to entry for researchers and developers working on large language models, enabling faster innovation and wider adoption of AI technologies.
Conclusion
Moonshot AI’s open-sourcing of its enhanced Muon optimizer highlights the increasing competition and innovation within China’s AI sector. The optimizer’s claimed ability to significantly reduce computational costs while improving efficiency could have a major impact on the development and deployment of large language models. As the open-source AI ecosystem continues to grow, we can expect to see further advancements in optimization techniques and other key areas of AI research.
References
- Machine Heart report on Moonshot AI’s Muon optimizer release: [Insert original article URL here]
- Machine Heart report on Moonshot AI and DeepSeek’s attention mechanism releases: [Insert original article URL here]