Time Series Forecasting Enters a New Era: Chinese Team Builds the Billion-Parameter Model Time-MoE

A World First Beyond One Billion Parameters: The Chinese Team Behind Time-MoE Sparks a Time Series Forecasting Revolution

In today's data-driven era, time series forecasting has become a core component of many fields: from financial market prediction to energy management, from e-commerce sales forecasting to weather forecasting, it is everywhere. Yet building a large-scale forecasting model that combines strong performance with efficient computation has remained a major challenge. Recently, an international team of Chinese researchers from Princeton University, Griffith University, and other institutions around the world proposed Time-MoE, a time series foundation model built on a mixture-of-experts (MoE) architecture. It is the first work to push a pre-trained time series model to the billion-parameter scale, a milestone for the field of time series forecasting.

Time-MoE: Breaking with Tradition to Lead a New Era of Time Series Forecasting

Time-MoE adopts an innovative mixture-of-experts architecture to scale the model to 2.4 billion parameters. This not only markedly improves forecasting accuracy but also lowers computational cost, allowing it to outperform a wide range of existing models and reach state-of-the-art (SOTA) performance across the board.

Key technical breakthroughs:

  • Powerful mixture-of-experts architecture: Time-MoE uses sparse activation, so only a subset of the network's experts is computed for each prediction. This preserves high forecasting accuracy while sharply reducing the computational load, addressing the inference-time bottleneck of large time series models (a minimal sketch follows this list).
  • Flexible forecasting horizons: Time-MoE supports inputs and outputs of arbitrary length, handling everything from short-term to long-term forecasting tasks within a single model.
  • The largest open-source time series dataset to date: the team built Time-300B, which spans 9 domains and more than 300 billion time points, giving the model rich multi-domain training data and strong generalization across diverse tasks.
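To make the sparse-activation idea concrete, here is a minimal, self-contained sketch of top-k mixture-of-experts routing, written from the description above rather than from the official code. Names such as `Expert`, `SparseMoELayer`, `num_experts`, and `top_k` are illustrative, not Time-MoE's actual identifiers.

```python
# Minimal sketch of sparse top-k MoE routing: each token is sent to only
# top_k of num_experts feed-forward networks, so most experts stay idle.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Expert(nn.Module):
    """A small feed-forward network; each expert specializes on part of the input space."""

    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_hidden),
            nn.GELU(),
            nn.Linear(d_hidden, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class SparseMoELayer(nn.Module):
    """Routes each token to its top-k experts; only those experts are computed."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList([Expert(d_model, d_hidden) for _ in range(num_experts)])
        self.gate = nn.Linear(d_model, num_experts)  # router producing expert scores
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing
        batch, seq_len, d_model = x.shape
        tokens = x.reshape(-1, d_model)

        # Score all experts, keep only the top-k per token, renormalize gate weights.
        scores = F.softmax(self.gate(tokens), dim=-1)
        weights, indices = scores.topk(self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # Run each expert only on the tokens routed to it (sparse activation).
            token_idx, slot_idx = (indices == e).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue
            contrib = weights[token_idx, slot_idx].unsqueeze(-1) * expert(tokens[token_idx])
            out.index_add_(0, token_idx, contrib)

        return out.reshape(batch, seq_len, d_model)


# With top_k=2 of 8 experts, only a quarter of the expert FFNs run per token.
layer = SparseMoELayer(d_model=64, d_hidden=128)
y = layer(torch.randn(4, 32, 64))
print(y.shape)  # torch.Size([4, 32, 64])
```

This is the mechanism behind the inference savings: the gate is always evaluated, but the expensive expert FFNs run only for the tokens routed to them.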

Experimental results:

  • Zero-shot forecasting: Time-MoE generalizes remarkably well in the zero-shot setting, reducing mean squared error (MSE) by about 20%.
  • Full-shot forecasting: after training on the target data, Time-MoE again achieves the best results, reducing MSE by about 24% and demonstrating its applicability to data from different domains.
  • Ablation studies: ablations confirm that Time-MoE's design choices improve accuracy, with the mixture-of-experts architecture proving critical to model performance.
  • Scalability analysis: Time-MoE shows steady performance gains as data volume and parameter count grow, and it consistently achieves lower MSE and better forecasts than dense models of the same scale.

Why Time-MoE matters:

The success of Time-MoE marks the start of a new era for time series forecasting. It not only outperforms existing models across the board but also establishes a viable paradigm for building large-scale, efficient, general-purpose time series foundation models. Its release opens new research directions for academia and unlocks substantial potential for industrial time series applications.

Looking ahead:

Time-MoE could bring transformative change to energy management, financial forecasting, e-commerce sales prediction, weather forecasting, and many other critical fields. As the research deepens, Time-MoE is expected to improve further and to be applied across an even wider range of domains, bringing broader benefits to society.

Paper: https://arxiv.org/pdf/2409.16040

Code: https://github.com/Time-MoE/Time-MoE
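For readers who want to try the released checkpoints, below is a minimal forecasting sketch that also exercises the arbitrary-horizon capability mentioned above. It assumes the repository's Hugging Face-style usage; the checkpoint id `Maple728/TimeMoE-50M` and the `generate`-based horizon handling are taken on that assumption and may differ from the current API.

```python
# Sketch of running a released Time-MoE checkpoint for forecasting.
# Assumption: the checkpoint id and generate()-style interface follow the
# repository README; adjust to the actual API if it differs.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    'Maple728/TimeMoE-50M',     # assumed checkpoint id
    trust_remote_code=True,     # model code is loaded from the checkpoint repo
)

# Normalize a univariate context window, as decoder-only forecasters typically expect.
context = torch.randn(1, 96)    # (batch, context_length): 96 past points
mean = context.mean(dim=-1, keepdim=True)
std = context.std(dim=-1, keepdim=True)
normed = (context - mean) / std

# Arbitrary horizon: the model supports any output length.
prediction_length = 24
output = model.generate(normed, max_new_tokens=prediction_length)
forecast = output[:, -prediction_length:] * std + mean  # de-normalize
print(forecast.shape)  # torch.Size([1, 24])
```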
