Keywords
Multivariate statistics, computer science, machine learning, unsupervised learning, artificial intelligence, regression, missing data, Transformer, imputation (statistics), feature learning, data mining, pattern recognition (psychology), statistics, mathematics, engineering, electrical engineering, voltage
Authors
George Zerveas, Srideepika Jayaraman, Dhaval Patel, Anuradha Bhamidipaty, Carsten Eickhoff
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 4
Identifier
DOI: 10.48550/arxiv.2010.02803
Abstract
In this work we propose for the first time a transformer-based framework for unsupervised representation learning of multivariate time series. Pre-trained models can be potentially used for downstream tasks such as regression and classification, forecasting and missing value imputation. By evaluating our models on several benchmark datasets for multivariate time series regression and classification, we show that not only does our modeling approach represent the most successful method employing unsupervised learning of multivariate time series presented to date, but also that it exceeds the current state-of-the-art performance of supervised methods; it does so even when the number of training samples is very limited, while offering computational efficiency. Finally, we demonstrate that unsupervised pre-training of our transformer models offers a substantial performance benefit over fully supervised learning, even without leveraging additional unlabeled data, i.e., by reusing the same data samples through the unsupervised objective.
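The abstract describes pre-training a transformer by reusing unlabeled samples through an unsupervised objective; in this line of work the objective is typically masked-value reconstruction, where random entries of the multivariate series are hidden and the model is trained to predict them. Below is a minimal PyTorch sketch of that idea. Everything here is illustrative: the names TSTransformerEncoder and masked_mse_pretrain_step, the hyperparameters, and the independent per-entry masking are assumptions for the sketch, not the authors' implementation (the paper's masking scheme is more structured, hiding contiguous spans per variable).

```python
import torch
import torch.nn as nn

class TSTransformerEncoder(nn.Module):
    """Transformer encoder that reconstructs the values of a multivariate
    time series from a corrupted copy (unsupervised pre-training sketch)."""

    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=3, dropout=0.1):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)   # embed the feature vector at each time step
        # Learnable positional embeddings; max sequence length of 512 is an assumption of this sketch.
        self.pos_embedding = nn.Parameter(torch.randn(1, 512, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model,
            dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.output_proj = nn.Linear(d_model, n_features)  # map back to the original feature space

    def forward(self, x):
        # x: (batch, seq_len, n_features)
        h = self.input_proj(x) + self.pos_embedding[:, : x.size(1)]
        h = self.encoder(h)
        return self.output_proj(h)

def masked_mse_pretrain_step(model, x, mask_ratio=0.15):
    """One unsupervised step: zero out random entries, predict them,
    and compute MSE only on the masked positions."""
    mask = torch.rand_like(x) < mask_ratio      # True where values are hidden
    x_corrupted = x.masked_fill(mask, 0.0)
    pred = model(x_corrupted)
    loss = ((pred - x) ** 2)[mask].mean()       # loss restricted to masked entries
    return loss

if __name__ == "__main__":
    model = TSTransformerEncoder(n_features=8)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(32, 100, 8)                 # toy batch: 32 series, 100 steps, 8 variables
    opt.zero_grad()
    loss = masked_mse_pretrain_step(model, x)
    loss.backward()
    opt.step()
    print(f"pre-training loss: {loss.item():.4f}")
```

After pre-training, the encoder's per-step representations can be pooled and fed to a small task-specific head and fine-tuned for the regression, classification, forecasting, or imputation tasks the abstract mentions.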