Authors
Yong Liu,Qin Guo,Zhiyuan Shi,Zhi Chen,C. B. Yang,Xiangdong Huang,Jianmin Wang,Mingsheng Long
Source
Journal: Cornell University - arXiv
Date: 2025-02-02
Identifier
DOI: 10.48550/arxiv.2502.00816
Abstract
We introduce Sundial, a family of native, flexible, and scalable time series foundation models. To predict the next-patch's distribution, we propose a TimeFlow Loss based on flow-matching, which facilitates native pre-training of Transformers on time series without discrete tokenization. Conditioned on arbitrary-length time series, our model is pre-trained without specifying any prior distribution and can generate multiple probable predictions, achieving flexibility in representation learning beyond using parametric densities. Towards time series foundation models, we leverage minimal but crucial adaptations of Transformers and curate TimeBench with 1 trillion time points, comprising mostly real-world datasets and synthetic data. By mitigating mode collapse through TimeFlow Loss, we pre-train a family of Sundial models on TimeBench, which exhibit unprecedented model capacity and generalization performance on zero-shot forecasting. In addition to presenting good scaling behavior, Sundial achieves new state-of-the-art on both point forecasting and probabilistic forecasting benchmarks. We believe that Sundial's pioneering generative paradigm will facilitate a wide variety of forecasting scenarios.
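The abstract's central technical idea is predicting the distribution of the next patch with a flow-matching objective rather than discrete tokenization. The sketch below is a minimal, hypothetical illustration of a conditional flow-matching loss in that spirit; the class name PatchFlowLoss, the velocity_net module, and the tensor shapes are assumptions made for illustration, not the paper's actual TimeFlow Loss implementation.

```python
# Illustrative sketch only: a conditional flow-matching loss over
# continuous next-patch values. Names and shapes are hypothetical;
# Sundial's actual TimeFlow Loss may differ in its details.
import torch
import torch.nn as nn


class PatchFlowLoss(nn.Module):
    """Train a velocity field that transports Gaussian noise to the
    target next patch, conditioned on the Transformer's representation
    of the observed series. No discrete tokenization is involved, and
    no parametric output density is assumed."""

    def __init__(self, velocity_net: nn.Module):
        super().__init__()
        # velocity_net maps (x_t, t, cond) -> predicted velocity
        self.velocity_net = velocity_net

    def forward(self, target_patch: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # target_patch: (batch, patch_len) ground-truth next-patch values
        # cond:         (batch, d_model) context embedding of the history
        noise = torch.randn_like(target_patch)                 # x_0 ~ N(0, I)
        t = torch.rand(target_patch.size(0), 1,
                       device=target_patch.device)             # t ~ U(0, 1)
        x_t = (1.0 - t) * noise + t * target_patch             # linear probability path
        v_target = target_patch - noise                        # constant velocity along the path
        v_pred = self.velocity_net(x_t, t, cond)
        return nn.functional.mse_loss(v_pred, v_target)
```

Under this kind of objective, multiple probable forecasts can be drawn at inference time by sampling fresh noise and integrating the learned velocity field from t = 0 to t = 1 (e.g., with a few Euler steps), which matches the abstract's claim of generative flexibility beyond fixed parametric densities.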