Topics: Computer science, Discriminative model, Artificial intelligence, Electroencephalography, Feature extraction, Feature (linguistics), Pattern recognition (psychology), Context (archaeology), Focus (optics), Modal, Eye movement, Speech recognition, Psychology, Paleontology, Linguistics, Philosophy, Optics, Biology, Physics, Chemistry, Psychiatry, Polymer chemistry
Authors
Zhiyi Yang, D. Li, Fazheng Hou, Yu Song
Source
Journal: IEEE Transactions on Circuits and Systems II: Express Briefs
Publisher: Institute of Electrical and Electronics Engineers
Date: 2024-03-01
Volume/Issue: 71(3), pp. 1526-1530
Citations: 1
Identifier
DOI:10.1109/tcsii.2023.3318814
Abstract
Recently, electroencephalogram (EEG)-based multimodal emotion recognition has emerged as one of the research hotspots in affective computing. However, existing methods tend to ignore the interaction information between EEG features and those of other modalities. In this brief, we propose a novel model termed EEANet (EEG and Eye movement Attention Network) to capture cross-modal correlations at the feature level. Differential entropy (DE) features and 31 eye movement features are extracted from the pre-processed EEG and eye movement signals, and two feedforward encoders then capture the deep features of each modality. An interactive attention layer learns multimodal complementary information and semantic-level context information, and a multi-head self-attention mechanism allows the model to focus on the features most discriminative for emotion classification. The model was verified on the SEED-IV dataset; the results show that EEANet significantly improves emotion recognition accuracy, reaching an average of 92.26% over the four emotion classes.
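For context on the DE features named in the abstract, the sketch below shows the standard closed-form computation used in the SEED literature: band-pass the EEG into canonical frequency bands and, treating each filtered signal as Gaussian, take 0.5·log(2πe·σ²) per channel and band. The band edges, filter order, and 200 Hz sampling rate are common SEED/SEED-IV conventions, not details taken from this brief.

```python
# Minimal sketch of differential-entropy (DE) feature extraction.
# Assumptions (not from the brief): 62-channel EEG at 200 Hz, order-4
# Butterworth band-pass, five canonical bands.
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}  # Hz, assumed band edges

def de_features(eeg: np.ndarray, fs: float = 200.0) -> np.ndarray:
    """eeg: (channels, samples). Returns (channels, n_bands) DE features.

    For a band-filtered signal assumed Gaussian, the differential entropy
    has the closed form 0.5 * log(2 * pi * e * var).
    """
    feats = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        band = filtfilt(b, a, eeg, axis=-1)   # zero-phase band-pass filter
        var = band.var(axis=-1) + 1e-12       # per-channel variance
        feats.append(0.5 * np.log(2 * np.pi * np.e * var))
    return np.stack(feats, axis=-1)

# Example: one 4-second, 62-channel window -> (62, 5) DE feature matrix
window = np.random.randn(62, 800)
print(de_features(window).shape)  # (62, 5)
```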
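The pipeline the abstract describes (two feedforward encoders, an interactive attention layer, then multi-head self-attention feeding a 4-class classifier) can be sketched as below. This is a hedged reconstruction, not the authors' implementation: the layer widths, head counts, residual fusion, concatenation of the two modality tokens, and the 310-dimensional EEG input (62 channels × 5 bands of DE features) are all assumptions; only the 31 eye movement features and the four emotion classes come from the abstract.

```python
# Hedged PyTorch sketch of an EEANet-like architecture.
import torch
import torch.nn as nn

class InteractiveAttention(nn.Module):
    """Cross-attention in both directions: each modality queries the other."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.eeg_from_eye = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.eye_from_eeg = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, eeg, eye):
        # eeg, eye: (batch, tokens, dim); each attends over the other modality
        eeg2, _ = self.eeg_from_eye(eeg, eye, eye)
        eye2, _ = self.eye_from_eeg(eye, eeg, eeg)
        return eeg + eeg2, eye + eye2  # residual fusion (an assumption)

class EEANetSketch(nn.Module):
    def __init__(self, eeg_dim=310, eye_dim=31, dim=128, classes=4):
        super().__init__()
        # Feedforward encoders lift each modality to a shared width.
        self.enc_eeg = nn.Sequential(nn.Linear(eeg_dim, dim), nn.ReLU())
        self.enc_eye = nn.Sequential(nn.Linear(eye_dim, dim), nn.ReLU())
        self.interact = InteractiveAttention(dim)
        self.self_attn = nn.MultiheadAttention(dim, 4, batch_first=True)
        self.head = nn.Linear(2 * dim, classes)

    def forward(self, eeg_feat, eye_feat):
        # Treat each encoded modality as a single token for attention.
        eeg = self.enc_eeg(eeg_feat).unsqueeze(1)   # (B, 1, dim)
        eye = self.enc_eye(eye_feat).unsqueeze(1)   # (B, 1, dim)
        eeg, eye = self.interact(eeg, eye)
        fused = torch.cat([eeg, eye], dim=1)        # (B, 2, dim)
        fused, _ = self.self_attn(fused, fused, fused)
        return self.head(fused.flatten(1))          # (B, classes) logits

model = EEANetSketch()
logits = model(torch.randn(8, 310), torch.randn(8, 31))
print(logits.shape)  # torch.Size([8, 4])
```

Treating each encoded modality as a single token keeps the sketch minimal; a per-channel or per-feature token layout would also be consistent with the abstract's description and would give the attention layers more structure to exploit.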