AC-CfC: An attention-based convolutional closed-form continuous-time neural network for raw multi-channel EEG-based emotion recognition

Keywords: Electroencephalography (EEG) · Convolutional neural network · Channel attention · Speech recognition · Pattern recognition · Artificial intelligence · Neuroscience
Authors
Yiwu Wang,Yingyue Zhou,Weikun Lu,Qiao Wu,Qiang Li,Runfeng Zhang
Source
Journal: Biomedical Signal Processing and Control [Elsevier]
Volume/Issue: 94: 106249 · Cited by: 6
Identifier
DOI: 10.1016/j.bspc.2024.106249
Abstract

Emotion recognition based on electroencephalogram (EEG) is a critical task in the field of affective brain-computer interfaces. However, due to the non-stationarity and individual variability of EEG, hand-designed features cannot adequately capture the nonlinear, high-dimensional properties of raw EEG. Spatial–temporal models have been shown to effectively capture spatial–temporal information in EEG, but they are characterized by complex structures, large parameter counts, and the need for extensive training data. To overcome these disadvantages, this paper proposes an end-to-end spatial–temporal model for emotion recognition from raw EEG, called the attention-based convolutional closed-form continuous-time neural network (AC-CfC). The model employs a channel attention mechanism to weight and encode the EEG, capturing channel dependencies. Subsequently, one-dimensional convolutional neural networks and closed-form continuous-time neural networks extract deep spatial–temporal features. Additionally, an adaptive loss-controlling mechanism is designed to enhance the model's ability to discriminate between easily confused classes. To verify the effectiveness of the proposed model, experiments are conducted on the DEAP and DREAMER datasets. In subject-independent experiments on the DEAP dataset, the average accuracies of the proposed model reach 94.76% and 93.01% for valence and arousal, improvements of 11.8% and 8.73%, respectively, over the state-of-the-art model ACRNN. On the DREAMER dataset, the average accuracies reach 81.83%, 81.22%, and 80.63% for valence, arousal, and dominance, improvements of 2.55%, 6.64%, and 6.98%, respectively, over ACRNN. These results show that the proposed model outperforms several state-of-the-art models in the same category.
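The abstract's first stage — weighting EEG channels by a learned attention mechanism before convolutional feature extraction — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it assumes a squeeze-and-excitation-style attention (global average over time, a small bottleneck, and a sigmoid gate), with hypothetical weight matrices `w1` and `w2` standing in for learned parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(eeg, w1, w2):
    """Re-weight EEG channels by learned importance (SE-style sketch).

    eeg: (channels, time) raw signal window
    w1:  (channels, r) squeeze projection to a bottleneck of width r
    w2:  (r, channels) excitation projection back to per-channel scores
    """
    squeeze = eeg.mean(axis=1)            # (channels,) average over time
    hidden = np.maximum(squeeze @ w1, 0)  # ReLU bottleneck
    attn = sigmoid(hidden @ w2)           # per-channel weights in (0, 1)
    return eeg * attn[:, None], attn      # weighted signal keeps its shape

# Toy example: 32-channel window of 128 samples, bottleneck r = 8.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 128))
w1 = rng.standard_normal((32, 8))
w2 = rng.standard_normal((8, 32))
weighted, attn = channel_attention(eeg, w1, w2)
print(weighted.shape, attn.shape)  # attended EEG feeds the 1D CNN stage
```

In the full AC-CfC pipeline described above, the attended signal would then pass through 1D convolutions and a closed-form continuous-time (CfC) recurrent stage; those stages are omitted here for brevity.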