Generalization
Computer science
Domain (mathematical analysis)
Benchmark (surveying)
Artificial intelligence
Task (project management)
Machine learning
Aggregate (composite)
Pattern recognition (psychology)
Mathematics
Mathematical analysis
Materials science
Management
Geodesy
Economics
Composite material
Geography
Authors
Mengzhu Wang, Jianlong Yuan, Zhibin Wang
Identifier
DOI:10.1145/3581783.3611871
Abstract
Domain generalization (DG) refers to the task of training a model on multiple source domains and testing it on an unseen target domain with a different distribution. In this paper, we address a more challenging and realistic scenario known as Single Long-Tailed Domain Generalization, where only one source domain is available and the minority class in this domain has an abundance of instances in other domains. To tackle this task, we propose a novel approach called Mixture-of-Experts Learner for Single Long-Tailed Domain Generalization (MoEL), which comprises two key strategies. The first is a simple yet effective data augmentation technique that leverages saliency maps to identify important regions of the original images and preserves these regions during augmentation. The second is a new skill-diverse expert learning approach that trains multiple experts on a single long-tailed source domain and leverages mutual learning to aggregate their learned knowledge for the unknown target domain. We evaluate our method on several benchmark datasets, including Digits-DG, CIFAR-10-C, PACS, and DomainNet, and demonstrate its superior performance compared to previous single domain generalization methods. An ablation study further illustrates the inner workings of our approach.
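The abstract describes the two strategies only at a high level. The sketches below illustrate one plausible reading of each; all function names, loss weights, and design choices are assumptions made for illustration, not the authors' released code.

A minimal sketch of saliency-preserving augmentation, assuming a gradient-based saliency map and a caller-supplied augmentation function: the most salient pixels of the original image are pasted back over the augmented image so that class-discriminative regions survive the augmentation.

```python
# Hypothetical sketch, not the paper's implementation: saliency-guided
# augmentation that preserves the most important image regions.
import torch
import torch.nn.functional as F

def saliency_map(model, images, labels):
    """Gradient magnitude of the loss w.r.t. input pixels, per image."""
    images = images.clone().requires_grad_(True)
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    model.zero_grad()  # discard parameter gradients from this probe pass
    # Max over channels gives one saliency value per pixel.
    return images.grad.detach().abs().amax(dim=1)  # (B, H, W)

def saliency_preserving_augment(model, images, labels, augment, keep_ratio=0.3):
    """Augment images but paste back the top-`keep_ratio` salient pixels."""
    sal = saliency_map(model, images, labels)                   # (B, H, W)
    thresh = torch.quantile(sal.flatten(1), 1 - keep_ratio, dim=1)
    mask = (sal >= thresh.view(-1, 1, 1)).unsqueeze(1).float()  # (B, 1, H, W)
    augmented = augment(images)
    # Keep salient regions from the original image, augment the rest.
    return mask * images + (1 - mask) * augmented
```

For the second strategy, a hypothetical sketch of skill-diverse expert learning with mutual learning: several classifier heads share one backbone, each head is trained with a different class re-weighting so their skills diverge across head, medium, and tail classes, and a KL term pulls each expert toward the averaged prediction of the others. The number of experts, the particular weightings, and `mutual_weight` are assumptions.

```python
# Hypothetical sketch, not the paper's implementation: multi-expert
# training with a mutual-learning (KL) term for knowledge aggregation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiExpert(nn.Module):
    def __init__(self, backbone, feat_dim, num_classes, num_experts=3):
        super().__init__()
        self.backbone = backbone
        self.experts = nn.ModuleList(
            nn.Linear(feat_dim, num_classes) for _ in range(num_experts)
        )

    def forward(self, x):
        feats = self.backbone(x)
        return [expert(feats) for expert in self.experts]  # one logit set per expert

def moel_style_loss(logits_list, labels, class_counts, mutual_weight=0.5):
    counts = class_counts.float()
    # Diversify experts with different class weightings (an assumption):
    # uniform, inverse-frequency, and inverse-sqrt-frequency.
    weightings = [torch.ones_like(counts), 1.0 / counts, counts.rsqrt()]
    ce = sum(
        F.cross_entropy(logits, labels, weight=w / w.sum() * len(counts))
        for logits, w in zip(logits_list, weightings)
    )
    # Mutual learning: each expert matches the averaged prediction of the others.
    kl = 0.0
    for i, logits in enumerate(logits_list):
        others = torch.stack([l for j, l in enumerate(logits_list) if j != i])
        target = F.softmax(others, dim=-1).mean(0).detach()
        kl = kl + F.kl_div(F.log_softmax(logits, -1), target, reduction="batchmean")
    return ce + mutual_weight * kl
```

At inference, a simple aggregation such as averaging the experts' softmax outputs could serve as the final prediction for the unseen target domain; the paper's actual aggregation rule may differ.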