Disentanglement Translation Network for multimodal sentiment analysis

Keywords: Computer science · Redundancy (engineering) · Encoder · Discriminative model · Artificial intelligence · Feature learning · Modality (human–computer interaction) · Machine learning
Authors
Ying Zeng,Wenjun Yan,Sijie Mai,Haifeng Hu
Source
Journal: Information Fusion [Elsevier]
Volume 102, Article no. 102031. Cited by: 42
Identifier
DOI: 10.1016/j.inffus.2023.102031
Abstract

Obtaining an effective joint representation has always been the goal of multimodal tasks. However, a distributional gap inevitably exists due to the heterogeneous nature of different modalities, which places a burden on the fusion process and the learning of the multimodal representation. The imbalance of modality dominance further aggravates this problem, as inferior modalities may contain substantial redundancy that introduces additional variation. To address these issues, we propose a Disentanglement Translation Network (DTN) with Slack Reconstruction to capture the desirable information properties, obtain a unified feature distribution, and reduce redundancy. Specifically, an encoder–decoder-based disentanglement framework is adopted to decouple the unimodal representations into modality-common and modality-specific subspaces, which capture the cross-modal commonality and diversity, respectively. In the encoding stage, to narrow the cross-modal discrepancy, a two-stage translation is devised and integrated with the disentanglement learning framework. The first stage learns a modality-invariant embedding for the modality-common information with an adversarial learning strategy, capturing the commonality shared across modalities. The second stage handles the modality-specific information that reveals diversity. To relieve the burden of multimodal fusion, we introduce Specific-Common Distribution Matching to further unify the distribution of the desirable information. For the decoding and reconstruction stage, we propose Slack Reconstruction to seek a balance between retaining discriminative information and reducing redundancy. Although the commonly used reconstruction loss with its strict constraint lowers the risk of information loss, it easily leads to the preservation of redundant information.
In contrast, Slack Reconstruction imposes a more relaxed constraint so that redundancy is not forced to be retained, while simultaneously exploring the inter-sample relationships. The proposed method aids multimodal fusion by learning the desired information properties and obtaining a more uniform distribution for cross-modal data, and reduces information redundancy to further ensure feature effectiveness. Extensive experiments on the task of multimodal sentiment analysis demonstrate the effectiveness of the proposed method. The code is available at https://github.com/zengy268/DTN.
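The contrast between a strict reconstruction loss and a more relaxed, relationship-based one can be sketched as follows. This is an illustrative interpretation, not the paper's actual formulation: the hypothetical `slack_reconstruction_loss` matches the cosine-similarity structure between samples rather than reproducing features element-wise, so components that only rescale features (one form of redundancy) are not forced to be preserved.

```python
import numpy as np

def strict_reconstruction_loss(x, x_hat):
    # Conventional element-wise MSE: forces x_hat to reproduce x exactly,
    # including any redundant components of x.
    return float(np.mean((x - x_hat) ** 2))

def slack_reconstruction_loss(x, x_hat):
    # Illustrative "slack" variant: instead of matching features element-wise,
    # match the inter-sample similarity structure. Per-feature detail need not
    # be reproduced as long as the relationships between samples survive.
    def sim_matrix(z):
        z = z / (np.linalg.norm(z, axis=1, keepdims=True) + 1e-8)
        return z @ z.T  # cosine similarity between every pair of samples
    return float(np.mean((sim_matrix(x) - sim_matrix(x_hat)) ** 2))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))                       # a batch of 8 unimodal features
x_hat = 0.5 * x + 0.1 * rng.normal(size=x.shape)   # rescaled, slightly noisy decoding

# Rescaling every feature hurts the strict loss, but barely changes the
# relational loss, since cosine similarities are scale-invariant.
print(strict_reconstruction_loss(x, x_hat) > slack_reconstruction_loss(x, x_hat))
```

Under this sketch, a decoder is penalized for distorting how samples relate to one another, not for dropping per-dimension detail, which is one way to read the abstract's "more relaxed constraint" that also "explores the inter-sample relationships".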