Keywords
Transformer, Time Series, Machine Learning, Artificial Intelligence, Computer Science
Authors
Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo
Source
Venue: Cornell University - arXiv
Date: 2024-02-04
Identifier
DOI: 10.48550/arxiv.2402.02592
Abstract
Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models. The concept of universal forecasting, emerging from pre-training on a vast collection of time series datasets, envisions a single Large Time Series Model capable of addressing diverse downstream forecasting tasks. However, constructing such a model poses unique challenges specific to time series data: i) cross-frequency learning, ii) accommodating an arbitrary number of variates for multivariate time series, and iii) addressing the varying distributional properties inherent in large-scale data. To address these challenges, we present novel enhancements to the conventional time series Transformer architecture, resulting in our proposed Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai). Trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains, Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models. Code, model weights, and data will be released.
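The masked-encoder idea mentioned in the abstract can be illustrated with a minimal sketch: context patches are embedded as tokens, the forecast horizon is represented by learnable mask tokens, and a Transformer encoder attends over both before a linear head decodes the masked positions into forecasts. Everything below (the class name MaskedEncoderForecaster, patch_len, d_model, the point-forecast head) is an illustrative assumption, not the paper's actual implementation.

```python
# Minimal sketch of a masked-encoder forecaster, assuming a simple
# patch-token design; names and hyperparameters are hypothetical.
import torch
import torch.nn as nn


class MaskedEncoderForecaster(nn.Module):
    """Encode context patches plus learnable mask tokens for the horizon,
    then decode the mask positions into forecast patches."""

    def __init__(self, patch_len: int = 16, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)            # patch -> token
        self.mask_token = nn.Parameter(torch.zeros(d_model))  # stand-in for future patches
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, patch_len)             # token -> forecast patch

    def forward(self, context: torch.Tensor, horizon_patches: int) -> torch.Tensor:
        # context: (batch, context_len), with context_len divisible by patch_len
        b, t = context.shape
        patches = context.view(b, t // self.patch_len, self.patch_len)
        tokens = self.embed(patches)                               # (b, n_ctx, d)
        masks = self.mask_token.expand(b, horizon_patches, -1)     # (b, n_hor, d)
        hidden = self.encoder(torch.cat([tokens, masks], dim=1))   # joint full attention
        out = self.head(hidden[:, -horizon_patches:])              # decode mask positions
        return out.reshape(b, horizon_patches * self.patch_len)


model = MaskedEncoderForecaster()
y_hat = model(torch.randn(8, 64), horizon_patches=2)  # forecast 32 future steps
print(y_hat.shape)  # torch.Size([8, 32])
```

This sketch covers only the masked-encoder forecasting mechanic; addressing the abstract's other challenges (cross-frequency learning, an arbitrary number of variates, and varying distributional properties) would additionally require variate-aware attention and a probabilistic output head rather than the point-forecast linear layer used here.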