Transfer learning
Computer science
Domain adaptation
Adaptation (eye)
Domain (mathematical analysis)
Machine learning
Artificial intelligence
Simplicity (philosophy)
Training set
Transfer (computing)
Mathematics
Mathematical analysis
Philosophy
Physics
Epistemology
Parallel computing
Classifier (UML)
Optics
Authors
Jiquan Ngiam, Daiyi Peng, Vijay Vasudevan, Simon Kornblith, Quoc V. Le, Ruoming Pang
Source
Journal: Cornell University - arXiv
Date: 2018-01-01
Cited by: 91
Identifier
DOI: 10.48550/arxiv.1811.07056
Abstract
Transfer learning is a widely used method to build high-performing computer vision models. In this paper, we study the efficacy of transfer learning by examining how the choice of data impacts performance. We find that more pre-training data does not always help, and transfer performance depends on a judicious choice of pre-training data. These findings are important given the continued increase in dataset sizes. We further propose domain adaptive transfer learning, a simple and effective pre-training method using importance weights computed based on the target dataset. Our method to compute importance weights follows from ideas in domain adaptation, and we show a novel application to transfer learning. Our methods achieve state-of-the-art results on multiple fine-grained classification datasets and are well-suited for use in practice.