Computer science
Recurrent neural network
Timestamp
Artificial intelligence
Transformer
Convolutional neural network
Forgetting
Pattern recognition (psychology)
Feature extraction
Time series
Deep learning
Machine learning
Artificial neural network
Real-time computing
Engineering
Linguistics
Philosophy
Voltage
Electrical engineering
Authors
Huiling Chen, Aosheng Tian, Ye Zhang, Yuzi Liu
Identifier
DOI:10.1109/iccasit55263.2022.9986835
Abstract
Early time series classification is of great significance for time-sensitive applications such as fault detection and earthquake prediction. The task aims to classify a time series using as few timestamps as possible while maintaining the desired accuracy. Recent deep learning methods usually use Recurrent Neural Networks (RNNs) as the classification backbone and an exiting subnet for early quitting. However, RNNs suffer from the 'forgetting' defect and insufficient local feature extraction. Besides, the balance between earliness and accuracy is not fully considered. In this paper, a framework named TCN-Transformer is proposed. To overcome the defects of RNNs, we combine a Temporal Convolutional Network (TCN) and a Transformer to extract both local and global features. Then, a loss function is designed to ensure classification performance while focusing more on earlier features. Experimental results are reported on ten univariate datasets.
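The abstract describes the TCN-Transformer backbone and its earliness-aware loss only at a high level. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: a stack of dilated causal convolutions (TCN-style) extracts local features, a Transformer encoder adds global context, and a cross-entropy loss weighted toward earlier timestamps stands in for the earliness/accuracy trade-off. All class names, hyperparameters, and the exponential weighting scheme are assumptions introduced for illustration.

```python
# Minimal sketch (not the paper's code): TCN-style local feature extraction,
# a Transformer encoder for global context, and an earliness-weighted loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConvBlock(nn.Module):
    """One dilated causal convolution block (TCN-style local feature extractor)."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation           # left-pad so no future leakage
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                                 # x: (batch, channels, time)
        out = F.pad(x, (self.pad, 0))                     # causal (left-only) padding
        return F.relu(self.conv(out)) + x                 # residual connection


class TCNTransformerSketch(nn.Module):
    """Hypothetical TCN-Transformer backbone for early time series classification."""
    def __init__(self, n_classes, d_model=64, n_blocks=3, n_heads=4):
        super().__init__()
        self.embed = nn.Conv1d(1, d_model, kernel_size=1)  # univariate input
        self.tcn = nn.Sequential(*[CausalConvBlock(d_model, dilation=2 ** i)
                                   for i in range(n_blocks)])
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               dim_feedforward=128,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                                  # x: (batch, time), univariate
        h = self.embed(x.unsqueeze(1))                     # (batch, d_model, time)
        h = self.tcn(h).transpose(1, 2)                    # (batch, time, d_model)
        h = self.encoder(h)                                # global context
        return self.head(h)                                # per-timestamp logits


def earliness_weighted_loss(logits, labels, alpha=0.02):
    """Cross-entropy at every prefix, down-weighting later timestamps so correct
    early classification is rewarded. The exponential weighting is an assumption,
    not the paper's exact loss."""
    batch, time, n_classes = logits.shape
    weights = torch.exp(-alpha * torch.arange(time, dtype=logits.dtype))
    per_step = F.cross_entropy(logits.reshape(-1, n_classes),
                               labels.repeat_interleave(time),
                               reduction="none").view(batch, time)
    return (per_step * weights).mean()


# Usage example with random data standing in for a univariate dataset.
model = TCNTransformerSketch(n_classes=5)
x = torch.randn(8, 100)                                    # 8 series, 100 timestamps
y = torch.randint(0, 5, (8,))
loss = earliness_weighted_loss(model(x), y)
loss.backward()
```

In this sketch the causal padding keeps every convolution from seeing future timestamps, so the per-timestamp logits can be read off at any prefix length, which is what an early-exit decision rule would consume.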