Distilling a Sparse-Meta Time Series Classifier for Open Radio Access Network-Based Cellular Vehicle-to-Everything
Computer Science
Forgetting
Artificial Intelligence
Transfer Learning
Machine Learning
Philosophy
Linguistics
Authors
Le Sun, Jiancong Liang, Ghulam Muhammad
Source
Journal: IEEE Transactions on Vehicular Technology [Institute of Electrical and Electronics Engineers]  Date: 2023-10-10  Volume/Issue: 73 (7): 9262-9271  Cited by: 2
Identifier
DOI: 10.1109/tvt.2023.3323279
Abstract
Deep learning-based univariate time series classification can improve the user experience of Open Radio Access Network (ORAN)-based Cellular Vehicle-to-Everything (CV2x). However, few institutes researching ORAN-based CV2x can satisfy the enormous demand for labeled data; this shortage is known as the few-shot learning problem. We therefore explore few-shot learning for ORAN-based CV2x in depth. Meta-transfer learning is a promising approach to few-shot learning, but most meta-transfer methods are still plagued by catastrophic forgetting. Numerous studies have demonstrated that deliberately applying gradient sparsity can significantly increase a meta-model's capacity for generalization. In this paper, we propose a pre-training framework named Distilling for Sparse-Meta-transfer Learning (DSML). It combines and enhances meta-transfer learning, multi-teacher knowledge distillation, and sparse Model-Agnostic Meta-Learning (sparse-MAML). It uses multi-teacher knowledge distillation to address catastrophic forgetting in the meta-learning phase, and it uses the sigmoid function to fundamentally resolve the gradient anomaly problem of sparse-MAML. We conduct ablation experiments on sparse-MAML and show that it does increase the meta-model's generalization capacity. We also compare DSML with state-of-the-art algorithms in the univariate time series classification field, and the results demonstrate that DSML performs better. Finally, we present two case studies of applying DSML to ORAN-based CV2x.
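To illustrate the multi-teacher knowledge distillation component the abstract refers to, the PyTorch sketch below shows the common pattern of blending cross-entropy on true labels with the average KL divergence to several frozen teachers' softened predictions. The function name, the alpha/temperature hyperparameters, and the equal weighting of teachers are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          temperature=4.0, alpha=0.5):
    # Hypothetical multi-teacher distillation loss: not the paper's exact
    # objective, only the standard pattern it builds on.
    ce = F.cross_entropy(student_logits, labels)
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Average the softened-prediction KL term over all teachers;
    # T^2 rescaling keeps gradient magnitudes comparable to the CE term.
    kd = torch.stack([
        F.kl_div(log_p_student,
                 F.softmax(t_logits / temperature, dim=-1),
                 reduction="batchmean") * temperature ** 2
        for t_logits in teacher_logits_list
    ]).mean()
    return alpha * ce + (1.0 - alpha) * kd

Retaining soft targets from teachers trained on earlier tasks is one standard way such a loss counteracts catastrophic forgetting during meta-training.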
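The sigmoid-based fix for sparse-MAML's gradient anomaly can also be pictured in code. In the original sparse-MAML, a hard binary mask gates each inner-loop gradient, which requires a non-differentiable step function; the minimal sketch below (assuming a learnable per-parameter score tensor, a hypothetical loss_fn, and an inner learning rate chosen for illustration) instead squashes the scores through a sigmoid so the mask stays smooth and differentiable.

import torch

def sparse_inner_step(params, scores, loss_fn, batch, inner_lr=0.01):
    # One gated inner-loop update. params and scores are lists of tensors
    # of matching shapes; sigmoid(scores) softly masks each gradient entry.
    loss = loss_fn(params, batch)
    grads = torch.autograd.grad(loss, params, create_graph=True)
    # Soft mask in (0, 1): unlike a hard step mask, gradients flow back
    # to the scores through the sigmoid, so the outer loop can learn
    # where to sparsify without a straight-through estimator.
    return [p - inner_lr * torch.sigmoid(s) * g
            for p, s, g in zip(params, scores, grads)]

Because create_graph=True keeps the inner update differentiable, an outer (meta) optimizer can update both the initial parameters and the sparsity scores by backpropagating through this step.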