Learning multi-scale features for speech emotion recognition with connection attention mechanism

Computer science, Utterance, Feature (linguistics), Feature learning, Artificial intelligence, Speech recognition, Pattern recognition (psychology), Representation, Frame (networking), Convolutional neural network, Emotion classification, Focus (optics), Feature extraction, Fusion mechanism, Fusion, Telecommunications, Linguistics
Authors
Zengzhao Chen,Jiawen Li,Hai Liu,Xuyang Wang,Wang Hu,Qiuyu Zheng
Source
Journal: Expert Systems With Applications [Elsevier]
Volume 214, Article 118943 · Cited by: 30
Identifier
DOI:10.1016/j.eswa.2022.118943
Abstract

Speech emotion recognition (SER) has become a crucial topic in the field of human–computer interaction. Feature representation plays an important role in SER, but it still poses many challenges, such as the inability to predict which features are most effective for SER and the cultural differences in emotion expression. Most previous studies use a single type of feature for the recognition task or conduct early fusion of features. However, a single type of feature cannot fully reflect the emotions in speech signals, and because different features carry different information, direct fusion cannot integrate their respective advantages. To overcome these challenges, this paper proposes a parallel network for multi-scale SER based on a connection attention mechanism (AMSNet). AMSNet fuses fine-grained frame-level manual features with coarse-grained utterance-level deep features. Meanwhile, it adopts different speech emotion feature extraction modules according to the temporal and spatial characteristics of speech signals, which enriches the features and improves their characterization. The network consists of a frame-level representation learning module (FRLM) based on the temporal structure and an utterance-level representation learning module (URLM) based on the global structure. Furthermore, an improved attention-based long short-term memory (LSTM) network is introduced into FRLM to focus on the frames that contribute most to the final emotion recognition result. In URLM, a convolutional neural network with a squeeze-and-excitation block (SCNN) is introduced to extract deep features. In addition, the connection attention mechanism is proposed for feature fusion, which applies different weights to different features. Extensive experiments are conducted on the IEMOCAP and EmoDB datasets, and the results demonstrate the effectiveness and performance superiority of AMSNet. Our code will be publicly available at https://codeocean.com/capsule/8636967/tree/v1.
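Reading the abstract as an architecture description, the following is a minimal PyTorch sketch of the two-branch design it outlines: a frame-level BiLSTM branch with attention pooling (FRLM-style), an utterance-level CNN branch with a squeeze-and-excitation block (URLM/SCNN-style), and an attention-weighted fusion of the two representations. Every class name, layer size, input shape, and the exact form of the connection attention below is an assumption made for illustration; the authors' released code at the Code Ocean link above is the authoritative implementation.

```python
# Hypothetical sketch of an AMSNet-like two-branch SER model (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveBiLSTM(nn.Module):
    """FRLM-style branch: BiLSTM over frame-level hand-crafted features,
    with learned attention pooling over frames (sizes are assumptions)."""
    def __init__(self, feat_dim=39, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)

    def forward(self, x):                       # x: (batch, frames, feat_dim)
        h, _ = self.lstm(x)                     # (batch, frames, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # frame-level attention weights
        return (w * h).sum(dim=1)               # (batch, 2*hidden)


class SEBlock(nn.Module):
    """Standard squeeze-and-excitation channel re-weighting."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                       # x: (batch, C, H, W)
        s = x.mean(dim=(2, 3))                  # squeeze: global average pooling
        s = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))
        return x * s.unsqueeze(-1).unsqueeze(-1)


class SCNN(nn.Module):
    """URLM-style branch: small CNN with an SE block over a spectrogram."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), SEBlock(64),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(64, out_dim)

    def forward(self, spec):                    # spec: (batch, 1, freq, time)
        return self.proj(self.conv(spec).flatten(1))


class AMSNetSketch(nn.Module):
    """Fuses the two branches with learned per-branch attention weights,
    one plausible reading of the 'connection attention mechanism'."""
    def __init__(self, n_classes=4, frame_dim=39):
        super().__init__()
        self.frlm = AttentiveBiLSTM(frame_dim)          # frame-level branch
        self.urlm = SCNN(out_dim=256)                   # utterance-level branch
        self.fusion_attn = nn.Linear(256 + 256, 2)      # one weight per branch
        self.classifier = nn.Linear(256 + 256, n_classes)

    def forward(self, frames, spec):
        f = self.frlm(frames)                           # (batch, 256)
        u = self.urlm(spec)                             # (batch, 256)
        a = torch.softmax(self.fusion_attn(torch.cat([f, u], dim=1)), dim=1)
        fused = torch.cat([a[:, :1] * f, a[:, 1:] * u], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = AMSNetSketch()
    frames = torch.randn(2, 300, 39)   # e.g. 300 frames of 39-dim MFCC-like features
    spec = torch.randn(2, 1, 64, 300)  # e.g. 64-bin mel spectrogram
    print(model(frames, spec).shape)   # torch.Size([2, 4]) emotion logits
```

The design choice illustrated here is the one the abstract emphasizes: the frame-level and utterance-level representations are kept in parallel and only combined at the end through attention weights, rather than being concatenated directly (early fusion).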