TMBL: Transformer-based multimodal binding learning model for multimodal sentiment analysis

Authors
Jiehui Huang, Jun Zhou, Zhenchao Tang, Jiaying Lin, Calvin Yu‐Chian Chen
Source
Journal: Knowledge-Based Systems [Elsevier BV]
Volume: 285, Article 111346 · Cited by: 16
Identifier
DOI: 10.1016/j.knosys.2023.111346
Abstract

Multimodal emotion analysis is an important endeavor in human–computer interaction research, as it enables the accurate identification of an individual's emotional state by simultaneously analyzing text, video, and sound features. Although current emotion recognition algorithms have performed well using multimodal fusion strategies, two key challenges remain. The first challenge is the efficient extraction of modality-invariant and modality-specific features prior to fusion, which requires deep feature interactions between the different modalities. The second challenge concerns the ability to distinguish high-level semantic relations between modality features. To address these issues, we propose a new modality-binding learning framework and redesign the internal structure of the transformer model. Our proposed modality binding learning model addresses the first challenge by incorporating bimodal and trimodal binding mechanisms. These mechanisms handle modality-specific and modality-invariant features, respectively, and facilitate cross-modality interactions. Furthermore, we enhance feature interactions by introducing fine-grained convolution modules in the feedforward and attention layers of the transformer structure. To address the second issue, we introduce CLS and PE feature vectors for modality-invariant and modality-specific features, respectively. We use similarity loss and dissimilarity loss to support model convergence. Experiments on the widely used MOSI and MOSEI datasets show that our proposed method outperforms state-of-the-art multimodal sentiment classification approaches, confirming its effectiveness and superiority. The source code can be found at https://github.com/JackAILab/TMBL.
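The abstract mentions a similarity loss (aligning modality-invariant features across modalities) and a dissimilarity loss (separating modality-specific features), but does not give the formulas. The NumPy sketch below is only an illustrative assumption of how such losses are commonly realized: similarity as the mean cosine distance between invariant vectors of each modality pair, and dissimilarity as an orthogonality penalty on the cross-correlation of specific feature matrices. The function names and shapes are hypothetical, not taken from the paper's code.

```python
import numpy as np

def similarity_loss(invariant):
    # invariant: dict of modality name -> (d,) modality-invariant feature vector.
    # Hypothetical choice: average (1 - cosine similarity) over all modality
    # pairs, encouraging invariant features of different modalities to align.
    names = list(invariant)
    total, pairs = 0.0, 0
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            a, b = invariant[names[i]], invariant[names[j]]
            cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)
            total += 1.0 - cos
            pairs += 1
    return total / pairs

def dissimilarity_loss(specific):
    # specific: dict of modality name -> (n, d) modality-specific feature matrix.
    # Hypothetical choice: squared Frobenius norm of the cross-correlation
    # between each pair, pushing the specific subspaces toward orthogonality.
    names = list(specific)
    total = 0.0
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            total += np.sum((specific[names[i]].T @ specific[names[j]]) ** 2)
    return total

# Toy usage with text/video/audio features.
rng = np.random.default_rng(0)
inv = {m: rng.normal(size=8) for m in ("text", "video", "audio")}
spec = {m: rng.normal(size=(4, 8)) for m in ("text", "video", "audio")}
print(similarity_loss(inv), dissimilarity_loss(spec))
```

Both terms would typically be added, with weighting coefficients, to the main sentiment-prediction loss during training.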