Computer science
Transfer learning
Artificial intelligence
Electroencephalography (EEG)
Binary classification
Pattern recognition (psychology)
Autoencoder
Generalization
Representation (politics)
Domain (mathematical analysis)
Machine learning
Emotion classification
Artificial neural network
Support vector machine
Mathematics
Law
Psychiatry
Mathematical analysis
Politics
Political science
Psychology
Authors
Jie Quan, Ying Li, Lingyue Wang, Renjie He, Shuo Yang, Lei Guo
Identifier
DOI:10.1016/j.bspc.2023.104741
Abstract
Emotion recognition based on electroencephalography (EEG) has received extensive attention because EEG signals are objective and not subject to conscious control. However, inter-individual differences limit model generalization in cross-subject recognition tasks. To address this problem, this paper proposes a cross-subject emotional EEG classification algorithm based on multi-source domain selection and subdomain adaptation. We first design a multi-representation variational autoencoder (MR-VAE) to automatically extract emotion-related features from multi-channel EEG, obtaining a consistent EEG representation with as little prior knowledge as possible. We then propose a multi-source domain selection algorithm that selects the existing subjects' EEG data whose global and sub-domain distributions are closest to those of the target data, thereby improving the performance of the transfer learning model on the target subject. We use a small amount of annotated target data to achieve knowledge transfer and raise the model's classification accuracy on the target subject as much as possible, which is of practical significance for clinical research. In our experiments, the proposed method achieves average classification accuracies of 92.83% and 79.30% on the two public datasets SEED and SEED-IV, respectively, which are 26.37% and 22.80% higher than the non-transfer-learning baseline. Furthermore, we validate the proposed method on two other commonly used public datasets, DEAP and DREAMER, establishing state-of-the-art results on the binary classification task of the DEAP dataset and achieving accuracy comparable to several transfer-learning-based methods on the DREAMER dataset. Detailed recognition results on DEAP and DREAMER are given in the Appendix.
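The multi-source selection step ranks existing subjects by how close their EEG feature distribution is to the target's. The paper's exact criterion is not given in the abstract; a minimal sketch, assuming the distribution distance is approximated by Maximum Mean Discrepancy (MMD) with an RBF kernel, might look like the following (all function names and parameters here are illustrative, not from the paper):

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """RBF kernel matrix between row-vector sample sets x and y."""
    sq = np.sum(x**2, axis=1)[:, None] + np.sum(y**2, axis=1)[None, :] - 2.0 * x @ y.T
    return np.exp(-gamma * sq)

def mmd2(x, y, gamma=1.0):
    """Biased estimate of squared MMD between samples x and y."""
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())

def select_sources(source_feats, target_feats, k=2, gamma=1.0):
    """Rank source subjects by MMD to the target features; keep the k closest.

    source_feats: dict mapping subject id -> (n_samples, n_features) array
    target_feats: (m_samples, n_features) array of target-subject features
    """
    dists = {s: mmd2(f, target_feats, gamma) for s, f in source_feats.items()}
    return sorted(dists, key=dists.get)[:k]
```

Only the selected subjects would then feed the subdomain-adaptation stage, so dissimilar subjects cannot drag the transferred model toward the wrong distribution.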