Title: Mistral 7B v0.2 Base Model Open-Sourced with 32K Context Support
Keywords: AI model update, context expansion, open-source project
News content:
Mistral AI recently announced the open-sourcing of its Mistral 7B v0.2 base model, unveiling it at the Cerebral Valley hackathon. This release makes available the original pre-trained model behind Mistral-7B-Instruct-v0.2, part of the company's "Mistral Tiny" series. The significant updates in this version include extending the context window from 8K to 32K tokens, setting the RoPE theta parameter to 1e6, and removing sliding-window attention. These changes are aimed at improving the model's long-context performance and usability. Mistral AI's decision to open-source this model not only showcases its technical capabilities in the AI field but also provides a valuable resource to developers worldwide.
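The RoPE theta change mentioned above can be illustrated with a short sketch. In rotary position embeddings, each pair of head dimensions rotates at a frequency of theta^(-2i/d); raising the base theta from a common default of 1e4 to 1e6 lowers the slowest frequency, stretching the longest wavelength the position encoding can represent, which is one standard way to accommodate a longer (here, 32K) context. The head dimension of 128 and the old base of 1e4 are assumptions for illustration, not values stated in the article:

```python
import math

def rope_frequencies(dim: int, theta: float) -> list[float]:
    # Per-pair rotary frequencies: theta ** (-2i / dim) for i = 0 .. dim/2 - 1.
    return [theta ** (-2 * i / dim) for i in range(dim // 2)]

# Head dimension 128 is typical for 7B-class models (assumption).
dim = 128
freqs_old = rope_frequencies(dim, 10_000.0)     # common default base (assumption)
freqs_new = rope_frequencies(dim, 1_000_000.0)  # v0.2 sets RoPE theta to 1e6

# The lowest frequency sets the longest wavelength the encoding can
# distinguish; a larger base stretches it, helping cover 32K positions.
wavelength_old = 2 * math.pi / freqs_old[-1]
wavelength_new = 2 * math.pi / freqs_new[-1]
print(f"longest wavelength: {wavelength_old:.0f} -> {wavelength_new:.0f} positions")
```

This is only a sketch of the positional-encoding arithmetic; the actual model weights were pre-trained with the 1e6 base, so the value is not a tunable knob at inference time.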
Source: https://mp.weixin.qq.com/s/R56Ob5dZjMh1alhMin8DZw