Keywords
Artificial intelligence
Computer science
Pattern recognition (psychology)
Image processing
Canonical correlation
Minimization
Correlation
Computer vision
Image (mathematics)
Mathematics
Geometry
Programming languages
Authors
Sheng Wang,Haishun Du,Ge Zhang,Jianfeng Lu,Jingyu Yang
Identifier
DOI: 10.1117/1.jei.29.2.023001
Abstract
Canonical correlation analysis (CCA) is a popular method that has been widely used for feature learning. In essence, the objective function of CCA is equivalent to minimizing the distance between paired data, with the L2-norm as the distance metric. An L2-norm-based objective, however, emphasizes pairs with large distances and de-emphasizes pairs with small distances. To alleviate this problem, we propose an approach named CCA based on L1-norm minimization (CCA-L1) for feature learning. To optimize its objective function, we develop an algorithm that attains a globally optimal value. To preserve the data distribution and to capture nonlinear structure, respectively, we propose two extensions of CCA-L1. Furthermore, all three proposed algorithms are extended to handle multifeature data. Experimental results on an artificial dataset, a real-world crop leaf disease dataset, the ORL face dataset, and the PIE face dataset show that our methods outperform traditional CCA and its variants.
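For context on the baseline the abstract refers to, classical L2-based CCA can be sketched as below: it finds projection directions that maximize the correlation between paired views, which (under unit-variance constraints) is equivalent to minimizing the L2 distance between the projected pairs. This is a minimal NumPy illustration of standard CCA via whitening and SVD, not the paper's CCA-L1 algorithm; the ridge term `reg` is an assumption added for numerical stability.

```python
import numpy as np

def cca(X, Y, k=1, reg=1e-8):
    """Classical (L2) CCA: return k pairs of projection directions
    Wx, Wy maximizing corr(X @ Wx, Y @ Wy), plus the canonical
    correlations. `reg` is a small ridge for numerical stability."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten both views via Cholesky factors, then take the SVD of
    # the whitened cross-covariance K = Lx^{-1} Cxy Ly^{-T}.
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    K = np.linalg.solve(Ly, np.linalg.solve(Lx, Cxy).T).T
    U, s, Vt = np.linalg.svd(K)
    # Map singular vectors back to the original coordinates.
    Wx = np.linalg.solve(Lx.T, U[:, :k])
    Wy = np.linalg.solve(Ly.T, Vt[:k].T)
    return Wx, Wy, s[:k]

# Demo: two 3-D views sharing one strongly correlated latent component.
rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))
X = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 2))])
Y = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 2))])
Wx, Wy, corr = cca(X, Y, k=1)
```

The top canonical correlation recovered here should be close to the true correlation of the shared component (about 0.99), while the remaining noise dimensions contribute little. The abstract's observation is that the implicit squared-L2 loss behind this formulation is dominated by large-distance pairs, which motivates replacing it with an L1-norm criterion.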