Authors
Lisheng Wen,Sentao Chen,Mengying Xie,Cheng Liu,Lin Zheng
Identifier
DOI:10.1016/j.neunet.2023.12.022
Abstract
We address the problem of Multi-Source Domain Adaptation (MSDA), which trains a neural network using multiple labeled source datasets and an unlabeled target dataset, and expects the trained network to classify the unlabeled target data well. The main challenge in this problem is that the datasets are generated by relevant but different joint distributions. In this paper, we propose to address this challenge by estimating and minimizing the mutual information in the network latent feature space, which leads to the alignment of the source joint distributions and the target joint distribution simultaneously. Here, the estimation of the mutual information is formulated as a convex optimization problem, such that the global optimal solution can be easily found. We conduct experiments on several public datasets, and show that our algorithm statistically outperforms its competitors. Video and code are available at https://github.com/sentaochen/Mutual-Information-Estimation-and-Minimization.
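To give intuition for the quantity the abstract minimizes, the sketch below computes the plug-in mutual information between a domain index D and a discretized latent feature Z from their joint probability table. This is not the paper's convex estimator (see the linked repository for that); it is only a minimal illustration, assuming features have been quantized into bins, of why zero mutual information corresponds to domain-aligned feature distributions.

```python
import numpy as np

def mutual_information(joint):
    """Plug-in mutual information I(D; Z) from a joint probability table.
    Rows index the domain D, columns index the (discretized) feature Z."""
    joint = joint / joint.sum()                 # normalize to a distribution
    pd = joint.sum(axis=1, keepdims=True)       # marginal over domains
    pz = joint.sum(axis=0, keepdims=True)       # marginal over features
    mask = joint > 0                            # avoid log(0) terms
    return float((joint[mask] * np.log(joint[mask] / (pd @ pz)[mask])).sum())

# Two domains whose feature distributions coincide: I(D; Z) = 0,
# i.e. the latent features carry no information about the domain.
aligned = np.array([[0.25, 0.25],
                    [0.25, 0.25]])

# Two domains concentrated on disjoint feature bins: I(D; Z) = log 2,
# i.e. the feature perfectly reveals the domain.
separated = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
```

Driving the mutual information toward zero during training therefore pushes the source and target joint distributions to coincide in the latent space, which is the alignment effect the abstract describes.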