Computer science
Univariate
Artificial intelligence
Representation (politics)
Machine learning
Mixing (physics)
Unsupervised learning
Smoothing
Feature learning
Series (stratigraphy)
Key (lock)
Transfer learning
Time series
Component (thermodynamics)
Supervised learning
Pattern recognition (psychology)
Multivariate statistics
Artificial neural network
Paleontology
Physics
Computer security
Quantum mechanics
Politics
Political science
Law
Computer vision
Biology
Thermodynamics
Authors
Kristoffer Wickstrøm,Michael Kampffmeyer,Karl Øyvind Mikalsen,Robert Jenssen
Identifier
DOI:10.1016/j.patrec.2022.02.007
Abstract
The lack of labeled data is a key challenge for learning useful representations from time series data. An unsupervised representation learning framework capable of producing high-quality representations would therefore be of great value. Such a framework is key to enabling transfer learning, which is especially beneficial for medical applications, where data is abundant but labeling is costly and time-consuming. We propose an unsupervised contrastive learning framework motivated by the perspective of label smoothing. The proposed approach uses a novel contrastive loss that naturally exploits a data augmentation scheme in which new samples are generated by mixing two data samples with a mixing component. The task in the proposed framework is to predict the mixing component, which is utilized as soft targets in the loss function. Experiments demonstrate the framework's superior performance compared to other representation learning approaches on both univariate and multivariate time series, and illustrate its benefits for transfer learning on clinical time series.
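The augmentation scheme described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the Beta(α, α) sampling of the mixing coefficient, and the two-class soft-target cross-entropy are illustrative assumptions; the paper's actual loss operates on learned representations within a contrastive framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixup_pair(x1, x2, alpha=0.2, rng=rng):
    """Generate a new sample by convexly mixing two time-series samples.
    The mixing coefficient lam is drawn from Beta(alpha, alpha)
    (an assumed choice for this sketch)."""
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2, lam

def soft_target_loss(pred_lam, true_lam, eps=1e-8):
    """Cross-entropy between the true mixing proportions (lam, 1 - lam),
    used as soft targets, and a model's predicted proportions."""
    p = np.clip(np.array([pred_lam, 1.0 - pred_lam]), eps, 1.0)
    t = np.array([true_lam, 1.0 - true_lam])
    return float(-(t * np.log(p)).sum())

# Toy univariate time series standing in for two data samples.
x1 = np.sin(np.linspace(0, 2 * np.pi, 100))
x2 = np.cos(np.linspace(0, 2 * np.pi, 100))

x_mix, lam = mixup_pair(x1, x2)        # augmented sample and its soft target
loss = soft_target_loss(0.5, lam)      # loss for an uninformed 50/50 guess
```

In the full framework, the prediction of the mixing component comes from the encoder's representations of the mixed and original samples rather than being supplied directly, but the soft-target structure of the loss is the same.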