Spatio-temporal representation learning enhanced speech emotion recognition with multi-head attention mechanisms

Keywords (auto-generated concepts): Computer science, Artificial intelligence, Leverage (statistics), Feature learning, Deep learning, Utterance, Feature (linguistics), Speech recognition, Convolutional neural network, Frame (networking), Telecommunications, Philosophy, Linguistics
Authors
Zengzhao Chen,Mengting Lin,Zhifeng Wang,Qiuyu Zheng,Chuan Liu
Source
Journal: Knowledge-Based Systems [Elsevier]
Volume: 281, Article 111077. Citations: 5
Identifier
DOI: 10.1016/j.knosys.2023.111077
Abstract

Speech emotion recognition (SER) systems have become essential in various fields, including intelligent healthcare, customer service, call centers, automatic translation systems, and human–computer interaction. However, current approaches predominantly rely on single frame-level or utterance-level features, offering only shallow or deep characterization, and fail to fully exploit the diverse types, levels, and scales of emotion features. The limited ability of single features to capture speech emotion information, and the failure of simple fusion to combine the complementary advantages of different features, pose significant challenges. To address these issues, this paper presents a novel spatio-temporal representation learning enhanced speech emotion recognition model with multi-head attention mechanisms (STRL-SER). The proposed technique integrates fine-grained frame-level features and coarse-grained utterance-level emotion features, employing separate modules to extract deep representations at each level. In the frame-level module, we introduce parallel networks, using a bidirectional long short-term memory network (BiLSTM) and an attention-based multi-scale convolutional neural network (CNN) to capture the spatio-temporal representation details of diverse frame-level signals. In addition, we extract deep representations of utterance-level features to learn global speech emotion features effectively. To leverage the advantages of different feature types, we introduce a multi-head attention mechanism that fuses the deep representations from the various levels while retaining the distinctive qualities of each feature type. Finally, we employ segment-level multiplexed decision making to generate the final classification results. We evaluate the effectiveness of the proposed method on two widely recognized benchmark datasets: IEMOCAP and RAVDESS.
The results demonstrate that our method achieves notable performance improvements compared to previous studies. On the IEMOCAP dataset, our method achieves a weighted accuracy (WA) of 81.60% and an unweighted accuracy (UA) of 79.32%. Similarly, on the RAVDESS dataset, we achieve a WA of 88.88% and a UA of 87.85%. These outcomes confirm the substantial advancements realized by our proposed method.
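The fusion stage described in the abstract, where a multi-head attention mechanism combines deep representations from different levels, can be illustrated with a minimal NumPy sketch of generic scaled dot-product multi-head attention. This is not the authors' implementation: the projection matrices are randomly initialized stand-ins for learned weights, and all dimensions (64-dimensional features, 50 frames, 4 heads) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, key, value, num_heads=4, seed=0):
    """Generic scaled dot-product multi-head attention.

    query: (len_q, d_model); key/value: (len_kv, d_model).
    d_model must be divisible by num_heads. Projection matrices are
    randomly initialized here purely for illustration.
    """
    len_q, d_model = query.shape
    d_head = d_model // num_heads
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))

    def split_heads(x):
        # (seq, d_model) -> (num_heads, seq, d_head)
        return x.reshape(x.shape[0], num_heads, d_head).transpose(1, 0, 2)

    q = split_heads(query @ Wq)
    k = split_heads(key @ Wk)
    v = split_heads(value @ Wv)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, len_q, len_kv)
    attn = softmax(scores, axis=-1)                      # each row sums to 1
    heads = attn @ v                                     # (heads, len_q, d_head)
    fused = heads.transpose(1, 0, 2).reshape(len_q, d_model)
    return fused @ Wo

# Hypothetical shapes: the utterance-level representation queries over
# a sequence of 50 frame-level deep features.
frame_repr = np.random.default_rng(1).standard_normal((50, 64))
utt_repr = np.random.default_rng(2).standard_normal((1, 64))
fused = multi_head_attention(utt_repr, frame_repr, frame_repr)
print(fused.shape)  # (1, 64)
```

Here the utterance-level query attends over the frame-level sequence, so the fused vector weights frame details by their relevance to the global representation; the paper's point is that attention-based fusion of this kind preserves the distinctive qualities of each feature type, unlike simple concatenation.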
