Keywords
Timestamp
Time series
Time-series forecasting
Transformer
Representation learning
Feature learning
Contrastive learning
Machine learning
Data mining
Real-time computing
Artificial intelligence
Computer science
Authors
Seonmin Kim, Dong-Kyu Chae
Identifier
DOI: 10.1145/3539618.3592013
Abstract
Time-series forecasting has been actively studied and adopted in various real-world domains. Recently, there have been two research mainstreams in this area: building Transformer-based architectures such as Informer, Autoformer, and Reformer, and developing time-series representation learning frameworks based on contrastive learning, such as TS2Vec and CoST. Both lines of work have greatly improved the performance of time-series forecasting. In this paper, we investigate a novel direction for improving forecasting performance even further, one that is orthogonal to the aforementioned mainstreams because it is a model-agnostic scheme. We focus on timestamp embeddings, which have received little attention in the literature. Our idea is simple yet effective: given the current timestamp, we predict the embeddings of its near-future timestamps and utilize the predicted embeddings in the time-series (value) forecasting task. We believe that if such future time information can be previewed at prediction time, it can be exploited by any time-series forecasting model as useful additional information. Our experimental results confirm that our method consistently and significantly improves the accuracy of recent Transformer-based models and time-series representation learning frameworks. Our code is available at: https://github.com/sunsunmin/Look_Ahead
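To make the model-agnostic idea concrete, below is a minimal sketch of how a look-ahead timestamp module could plug into an arbitrary forecasting backbone. This is an assumption-based illustration in PyTorch, not the authors' implementation (see the repository above for that): the class name LookAheadStampEmbedding, the choice of an MLP predictor, and the embedding size are all hypothetical.

```python
# Hypothetical sketch of the "look-ahead" timestamp-embedding idea;
# NOT the authors' code. Assumes PyTorch and normalized calendar
# features (hour, weekday, month, ...) as the raw timestamp input.
import torch
import torch.nn as nn

class LookAheadStampEmbedding(nn.Module):
    """Embeds the current timestamp, then predicts the embedding of a
    near-future timestamp, so any forecaster can consume future time
    information at prediction time."""
    def __init__(self, num_time_features: int, d_embed: int = 32):
        super().__init__()
        # Embed raw calendar features of the current timestamp.
        self.stamp_embed = nn.Linear(num_time_features, d_embed)
        # Small MLP mapping the current stamp embedding to a
        # predicted embedding of the future stamp (hypothetical choice).
        self.predictor = nn.Sequential(
            nn.Linear(d_embed, d_embed), nn.ReLU(),
            nn.Linear(d_embed, d_embed),
        )

    def forward(self, current_stamp: torch.Tensor) -> torch.Tensor:
        # current_stamp: (batch, num_time_features)
        e_now = self.stamp_embed(current_stamp)
        return self.predictor(e_now)  # predicted future-stamp embedding

# Usage: concatenate the predicted future-stamp embedding onto the
# value inputs of any backbone (Informer, a TS2Vec head, a plain MLP, ...).
if __name__ == "__main__":
    batch, num_feats = 8, 5
    look_ahead = LookAheadStampEmbedding(num_time_features=num_feats)
    stamps = torch.rand(batch, num_feats)   # normalized calendar features
    values = torch.rand(batch, 16)          # some value representation
    extra = look_ahead(stamps)              # (batch, 32)
    model_input = torch.cat([values, extra], dim=-1)  # feed to any forecaster
    print(model_input.shape)                # torch.Size([8, 48])
```

In this sketch the predictor would be trained jointly with the backbone; how the future-stamp embedding is supervised and fused is exactly what the paper's method specifies, so consult the linked repository for the actual design.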