Domain adaptation
Subspace topology
Transfer of learning
Computer science
Unsupervised learning
Adaptation (eye)
Artificial intelligence
Convex combination
Regular polygon
Domain (mathematical analysis)
Convex optimization
Mathematics
Psychology
Classifier (UML)
Mathematical analysis
Neuroscience
Geometry
Authors
Zhipeng Lin, Wenjing Yang, Tingjin Luo, Yongjun Zhang, Yuhua Tang
Identifier
DOI:10.1109/icme.2019.00254
Abstract
Transfer subspace learning aims to learn a robust subspace for the target domain by leveraging knowledge from the source domain. Traditional methods often adopt convex norms to approximate the original sparse and low-rank constraints, which makes the optimization problem easy to solve. However, such a relaxed approximation causes the solution to deviate from that of the original non-convex model. In this paper, we propose a novel Non-convex Transfer Subspace Learning (NTSL) method that provides a tighter approximation to the original sparse and low-rank constraints. Specifically, we design an objective function that leverages the Schatten p-norm and the ℓ_{2,p}-norm to preserve the structure between the source and target domains. With the Schatten p-norm, the objective function approximates the rank minimization problem more closely than the nuclear norm does, while preserving the structure of the domains. In addition, the ℓ_{2,p}-norm reduces the effect of noise and improves robustness to outliers. We also develop an efficient algorithm to solve the resulting non-convex minimization problem. Extensive experimental results on cross-domain tasks show the effectiveness of the proposed method.
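To make the two regularizers in the abstract concrete, here is a minimal numerical sketch of the Schatten p-norm (the ℓ_p norm of a matrix's singular values, which tends toward the rank as p → 0 and equals the nuclear norm at p = 1) and the ℓ_{2,p}-norm (the ℓ_p norm of the rows' ℓ_2 norms, which for small p promotes row sparsity and down-weights outlier rows). This is only an illustration of the norms themselves, not the paper's NTSL optimization algorithm; the function names are ours.

```python
import numpy as np

def schatten_p_norm(X, p):
    """Schatten p-norm of X: (sum_i sigma_i^p)^(1/p) over singular values.
    p = 1 recovers the nuclear norm; p = 2 recovers the Frobenius norm;
    as p -> 0, sum_i sigma_i^p counts nonzero singular values (the rank)."""
    sigma = np.linalg.svd(X, compute_uv=False)
    return np.sum(sigma ** p) ** (1.0 / p)

def l2p_norm(X, p):
    """l_{2,p}-norm of X: (sum_i ||x_i||_2^p)^(1/p) over rows x_i.
    Smaller p penalizes large outlier rows less aggressively than p = 2."""
    row_norms = np.linalg.norm(X, axis=1)
    return np.sum(row_norms ** p) ** (1.0 / p)

# Sanity check on a diagonal matrix, whose singular values are {4, 3}.
X = np.diag([3.0, 4.0])
print(schatten_p_norm(X, 1))  # nuclear norm: 3 + 4 = 7
print(schatten_p_norm(X, 2))  # Frobenius norm: sqrt(9 + 16) = 5
print(l2p_norm(X, 1))         # row norms are 3 and 4, so 7
```

The tighter rank approximation comes from choosing p in (0, 1): sigma^p grows sub-linearly, so large singular values are penalized less than under the nuclear norm, pushing small ones toward zero instead.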