Transfer learning
Computer science
Feature learning
Artificial intelligence
Softmax function
Autoencoder
Semi-supervised learning
Redundancy (engineering)
Machine learning
Deep learning
Encoder
Regularization
Manifold alignment
Pattern recognition (psychology)
Multi-task learning
Nonlinear dimensionality reduction
Dimensionality reduction
Economics
Management
Operating system
Task (project management)
Authors
Yi Zhu,Xindong Wu,Peipei Li,Yuhong Zhang,Xuegang Hu
Identifier
DOI:10.1016/j.neucom.2019.08.078
Abstract
Transfer learning has achieved excellent performance in the past few years. Finding feature representations that minimize the distance between the source and target domains is a crucial problem in transfer learning. Recently, deep learning methods have been proposed to learn higher-level and more robust representations. However, in traditional methods, label information in the source domain is not used to optimize both the feature representations and the parameters of the learning model. Additionally, redundancy in the data may degrade transfer learning performance. To address these problems, we propose a novel semi-supervised representation deep learning framework for transfer learning. In this framework, manifold regularization is integrated into the parameter optimization, and label information is encoded using a softmax regression model in the auto-encoders. Meanwhile, a whitening layer is introduced before the auto-encoders to reduce data redundancy. Extensive experiments demonstrate the effectiveness of the proposed framework compared to other competing state-of-the-art baseline methods.
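The pipeline the abstract describes can be illustrated with a minimal NumPy sketch: PCA-whiten the pooled source and target data, then train a one-hidden-layer auto-encoder whose loss combines reconstruction error, softmax cross-entropy on the labeled source samples, and a graph-Laplacian (manifold) regularizer on the shared hidden representation. This is not the authors' implementation; the toy data, layer sizes, and hyperparameters are assumptions chosen only to show how the three loss terms fit together.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_whiten(X, eps=1e-5):
    """Decorrelate features and scale each to unit variance (PCA whitening)."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(Xc.T @ Xc / len(Xc))
    return Xc @ (vecs / np.sqrt(vals + eps))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def knn_laplacian(X, k=5):
    """Graph Laplacian L = D - A of a symmetrized k-NN adjacency graph."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.zeros((len(X), len(X)))
    for i in range(len(X)):
        A[i, np.argsort(D2[i])[1:k + 1]] = 1.0
    A = np.maximum(A, A.T)
    return np.diag(A.sum(axis=1)) - A

# Toy labeled source domain and unlabeled, shifted target domain (assumed data).
Xs = rng.normal(size=(100, 5))
ys = (Xs[:, 0] > 0).astype(int)
Xt = rng.normal(size=(80, 5)) + 0.5

X = pca_whiten(np.vstack([Xs, Xt]))         # whitening layer before the auto-encoder
Lap = knn_laplacian(X)
n, d = X.shape
n_s, h, kcls, lam, lr = len(Xs), 4, 2, 1e-3, 0.05
Y = np.eye(kcls)[ys]                        # one-hot source labels

W1 = rng.normal(scale=0.1, size=(d, h))     # encoder weights
W2 = rng.normal(scale=0.1, size=(h, d))     # decoder weights
Ws = rng.normal(scale=0.1, size=(h, kcls))  # softmax head (source labels only)

losses = []
for _ in range(200):
    H = np.tanh(X @ W1)                     # shared hidden representation
    Xrec = H @ W2                           # reconstruction of whitened inputs
    P = softmax(H[:n_s] @ Ws)               # class posteriors on source samples
    loss = (((Xrec - X) ** 2).mean()                      # reconstruction
            - np.log(P[np.arange(n_s), ys] + 1e-12).mean()  # source cross-entropy
            + lam * np.trace(H.T @ Lap @ H) / n)            # manifold regularizer
    losses.append(loss)

    # Manual backprop through the three loss terms.
    dXrec = 2.0 * (Xrec - X) / Xrec.size
    dlogit = (P - Y) / n_s
    dH = dXrec @ W2.T + 2.0 * lam * (Lap @ H) / n
    dH[:n_s] += dlogit @ Ws.T
    dZ = dH * (1.0 - H ** 2)                # tanh derivative

    W2 -= lr * (H.T @ dXrec)
    Ws -= lr * (H[:n_s].T @ dlogit)
    W1 -= lr * (X.T @ dZ)
```

The manifold term `tr(HᵀLH)` penalizes hidden representations that differ between nearby points in the k-NN graph, pulling source and target samples that lie close on the data manifold toward similar codes; the softmax head injects the source-domain label information into the same encoder weights.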