Keywords
Computer Science
Artificial Intelligence
Pooling
Deep Learning
Convolutional Neural Network
Encoder
Transformer
Pattern Recognition
Artificial Neural Network
Classifier
Embedding
Machine Learning
Supervised Learning
Speech Recognition
Authors
Weisi Kong, Xun Jiao, Yuhua Xu, Bolin Zhang, Qinghai Yang
Source
Journal: IEEE Transactions on Cognitive Communications and Networking
Publisher: Institute of Electrical and Electronics Engineers
Date: 2023-04-05
Volume/Issue: 9 (4): 950-962
Cited by: 9
Identifier
DOI: 10.1109/tccn.2023.3264908
Abstract
The application of deep learning improves both the processing speed and the accuracy of automatic modulation recognition (AMR), enabling intelligent spectrum management and electronic reconnaissance. However, deep-learning-aided AMR usually requires a large number of labeled samples to obtain a reliable neural network model, whereas in practice, owing to economic costs and privacy constraints, labeled samples are scarce while unlabeled samples are plentiful. This paper proposes a Transformer-based contrastive semi-supervised learning framework for AMR. First, the Transformer-based encoder is pre-trained with self-supervised contrastive learning on unlabeled samples, with data augmentation realized through time warping. Then, the pre-trained encoder and a randomly initialized classifier are fine-tuned on labeled samples, using hierarchical learning rates to ensure classification accuracy. To address the difficulties of applying the Transformer to AMR, a convolutional transformer deep neural network is proposed, which incorporates convolutional embedding, attention bias, and attention pooling. In experiments, the feasibility of the framework is analyzed through linear evaluation of its components on the RML2016.10a dataset, and the framework is compared with existing semi-supervised methods on the RML2016.10a and RML2016.10b datasets to verify its superiority and stability.
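As a rough illustration of the pre-training stage described above, the following is a minimal sketch pairing a time-warping augmentation with a SimCLR-style NT-Xent contrastive loss, assuming PyTorch and I/Q input of shape (batch, 2, length). The warp range, temperature, and the encoder/projector interfaces are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def time_warp(x, max_scale=0.2):
    # Randomly stretch or compress the I/Q sequence along time, then
    # resample back to the original length. x: (batch, 2, length).
    length = x.size(-1)
    scale = 1.0 + (2 * torch.rand(1).item() - 1) * max_scale
    warped = F.interpolate(x, size=max(2, int(length * scale)),
                           mode="linear", align_corners=False)
    return F.interpolate(warped, size=length,
                         mode="linear", align_corners=False)

def nt_xent(z1, z2, tau=0.1):
    # NT-Xent loss: each view's positive is the other view of the same
    # sample; all remaining samples in the batch act as negatives.
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # (2N, d)
    sim = z @ z.t() / tau                                     # scaled cosine
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool,
                               device=z.device), float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n),
                         torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def pretrain_step(encoder, projector, optimizer, x):
    # One self-supervised step on an unlabeled batch: two warped views,
    # one contrastive loss. encoder/projector map (B, 2, L) -> (B, d).
    loss = nt_xent(projector(encoder(time_warp(x))),
                   projector(encoder(time_warp(x))))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()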
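The fine-tuning stage can be read as ordinary supervised training with two learning-rate groups: a small rate for the pre-trained encoder so its contrastive features are not destroyed, and a larger rate for the freshly initialized classifier head. The sketch below shows one way to express such hierarchical rates in PyTorch; the specific values are assumptions, not the paper's reported settings.

import torch
import torch.nn.functional as F

def build_optimizer(encoder, classifier, lr_enc=1e-4, lr_head=1e-3):
    # Two parameter groups with different rates realize the
    # "hierarchical learning rates" idea from the abstract.
    return torch.optim.Adam([
        {"params": encoder.parameters(), "lr": lr_enc},
        {"params": classifier.parameters(), "lr": lr_head},
    ])

def finetune_step(encoder, classifier, optimizer, x, y):
    # One supervised step on a labeled batch (x, y).
    loss = F.cross_entropy(classifier(encoder(x)), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()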
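Of the three convolutional-transformer ingredients the abstract names, two lend themselves to a compact sketch: a convolutional embedding that tokenizes the raw I/Q sequence for the Transformer, and attention pooling that condenses the token sequence into a single feature vector for the classifier (the attention-bias term is omitted here). Kernel size, stride, and dimensions are illustrative assumptions.

import torch
import torch.nn as nn

class ConvEmbedding(nn.Module):
    # Map a (B, 2, L) I/Q signal to (B, T, d) tokens with a strided
    # 1-D convolution instead of a hand-crafted patching scheme.
    def __init__(self, dim=64, kernel=8, stride=4):
        super().__init__()
        self.proj = nn.Conv1d(2, dim, kernel_size=kernel, stride=stride)

    def forward(self, x):
        return self.proj(x).transpose(1, 2)        # (B, T, d)

class AttentionPooling(nn.Module):
    # Learn a weighted sum over token positions, a lightweight
    # alternative to a [CLS] token or plain mean pooling.
    def __init__(self, dim=64):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, h):                          # h: (B, T, d)
        w = torch.softmax(self.score(h), dim=1)    # weights over time
        return (w * h).sum(dim=1)                  # (B, d) summary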