Keywords
Multivariate statistics
Computer science
Artificial intelligence
Series (stratigraphy)
Time series
Multivariate analysis
Data mining
Machine learning
Geology
Paleontology
Authors
Zhiwen Xiao, Huanlai Xing, Rong Qu, Li Feng, Shouxi Luo, Penglin Dai, Bowen Zhao, Yuanshun Dai
Source
Journal: IEEE Transactions on Systems, Man, and Cybernetics
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2024-01-09
Volume/Issue: 54 (4): 2192-2204
Citations: 59
Identifier
DOI:10.1109/tsmc.2023.3342640
Abstract
Multivariate time series classification (MTSC) based on deep learning (DL) has attracted increasing research attention. The performance of a DL-based MTSC algorithm depends heavily on the quality of the learned representations, which provide semantic information for downstream tasks such as classification. Hence, a model's representation learning ability is critical for enhancing its performance. This article proposes a densely knowledge-aware network (DKN) for MTSC. The DKN's feature extractor consists of a residual multihead convolutional network (ResMulti) and a transformer-based network (Trans), together called ResMulti-Trans. ResMulti has five residual multihead blocks for capturing local patterns in the data, while Trans has three transformer blocks for extracting global patterns. In addition, to enable dense mutual supervision between lower- and higher-level semantic information, this article adapts densely dual self-distillation (DDSD) to mine the rich regularizations and relationships hidden in the data. Experimental results show that, compared with 5 state-of-the-art self-distillation variants, the proposed DDSD obtains 13/4/13 in terms of "win"/"tie"/"lose" and achieves the lowest AVG_rank score. In particular, compared with pure ResMulti-Trans, DKN results in 20/1/9 regarding win/tie/lose. Finally, DKN outperforms 18 existing MTSC algorithms on 10 UEA2018 datasets and achieves the lowest AVG_rank score.
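The abstract only names the components of the ResMulti-Trans feature extractor: five residual multihead convolutional blocks for local patterns followed by three transformer blocks for global patterns. The following PyTorch sketch illustrates that layout under stated assumptions; the channel width, kernel sizes, head count, and time pooling are illustrative choices, not the authors' published configuration, and DDSD is omitted.

# Minimal sketch of a ResMulti-Trans-style feature extractor as described in
# the abstract: 5 residual multihead convolutional blocks, then 3 transformer
# blocks. Hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn


class ResMultiBlock(nn.Module):
    """Residual block with parallel ('multihead') 1-D convolutions of different kernel sizes."""

    def __init__(self, channels: int, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.heads = nn.ModuleList(
            nn.Conv1d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )
        self.merge = nn.Conv1d(channels * len(kernel_sizes), channels, 1)
        self.norm = nn.BatchNorm1d(channels)
        self.act = nn.ReLU()

    def forward(self, x):  # x: (batch, channels, time)
        h = torch.cat([head(x) for head in self.heads], dim=1)  # concatenate heads
        return self.act(self.norm(self.merge(h)) + x)           # residual connection


class ResMultiTrans(nn.Module):
    """Feature extractor: 5 ResMulti blocks (local patterns) + 3 transformer blocks (global patterns)."""

    def __init__(self, in_channels: int, channels: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Conv1d(in_channels, channels, 1)          # project variables to channels
        self.res_multi = nn.Sequential(*[ResMultiBlock(channels) for _ in range(5)])
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=channels, nhead=n_heads, batch_first=True
        )
        self.trans = nn.TransformerEncoder(encoder_layer, num_layers=3)

    def forward(self, x):  # x: (batch, variables, time)
        h = self.res_multi(self.embed(x))     # local patterns, (batch, channels, time)
        h = self.trans(h.transpose(1, 2))     # global patterns, (batch, time, channels)
        return h.mean(dim=1)                  # time-pooled representation for a classifier head


if __name__ == "__main__":
    series = torch.randn(8, 6, 128)           # 8 samples, 6 variables, 128 time steps
    features = ResMultiTrans(in_channels=6)(series)
    print(features.shape)                      # torch.Size([8, 64])

A classification head (e.g., a linear layer over the pooled representation) and the DDSD training objective would sit on top of this extractor; the paper should be consulted for the actual block design and distillation losses.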