Metric (unit)
Transformation (genetics)
Representation (politics)
Matrix (chemical analysis)
Eigenvector
Similarity (geometry)
Transformation matrix
Computer science
Artificial intelligence
Distance matrix
Matrix similarity
Matrix representation
Identity matrix
Mathematics
Pattern recognition (psychology)
Algorithm
Image (mathematics)
Group (periodic table)
Mathematical analysis
Political science
Physics
Operations management
Kinematics
Law
Materials science
Chemistry
Organic chemistry
Biochemistry
Quantum mechanics
Economics
Composite material
Classical mechanics
Partial differential equation
Politics
Gene
Authors
Luping Zhou,Lei Wang,Jianjia Zhang,Yinghuan Shi,Yang Gao
Identifier
DOI:10.1109/cvpr.2017.752
Abstract
The success of many visual recognition tasks largely depends on a good similarity measure, and distance metric learning plays an important role in this regard. Meanwhile, the Symmetric Positive Definite (SPD) matrix is receiving increasing attention as a feature representation in multiple computer vision applications. However, distance metric learning on SPD matrices has not been sufficiently researched. A few existing works approached this by learning either a d² × p or a d × k transformation matrix for d × d SPD matrices. Different from these methods, this paper proposes a new member of the family of distance metric learning methods for SPD matrices. It learns only d parameters to adjust the eigenvalues of the SPD matrices through an efficient optimisation scheme. Also, it is shown that the proposed method can be interpreted as learning a sample-specific transformation matrix, instead of the fixed transformation matrix learned for all samples in the existing works. The optimised d parameters can be used to massage the SPD matrices for better discrimination while still keeping them in the original space. From this perspective, the proposed method complements, rather than competes with, the existing linear-transformation-based methods, as the latter can always be applied to the output of the former to perform further distance metric learning. The proposed method has been tested on multiple SPD-based visual representation data sets used in the literature, and the results demonstrate its interesting properties and attractive performance.
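The core idea in the abstract, adjusting the d eigenvalues of a d × d SPD matrix and reading the result as a sample-specific transformation, can be illustrated with a minimal sketch. The per-eigenvalue multiplicative weights `w` and the helper `adjust_spd` below are illustrative assumptions, not the paper's exact parameterisation or optimisation scheme; the sketch only shows why an eigenvalue adjustment X → U diag(w ⊙ λ) Uᵀ is equivalent to applying a transformation T that depends on the sample's own eigenvectors.

```python
import numpy as np

def adjust_spd(X, w):
    """Adjust the eigenvalues of an SPD matrix X with d parameters w
    (one weight per eigenvalue), keeping the result SPD.

    NOTE: the per-eigenvalue multiplicative scaling used here is an
    illustrative assumption, not necessarily the paper's exact scheme.
    """
    lam, U = np.linalg.eigh(X)            # X = U diag(lam) U^T
    lam_adj = np.maximum(w, 1e-12) * lam  # adjusted spectrum stays positive
    X_adj = (U * lam_adj) @ U.T           # back to an SPD matrix
    # Equivalent sample-specific transformation T with X_adj = T X T^T:
    T = (U * np.sqrt(lam_adj / lam)) @ U.T
    return X_adj, T

# Toy usage: a random 4x4 SPD matrix and d = 4 adjustment parameters.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = A @ A.T + 4 * np.eye(4)
w = np.array([0.5, 1.0, 1.5, 2.0])
X_adj, T = adjust_spd(X, w)
assert np.allclose(X_adj, T @ X @ T.T)
```

Because T is built from the eigenvectors of the input matrix, it differs from sample to sample, which is the sample-specific reading mentioned in the abstract; a fixed linear-transformation method could still be applied on top of `X_adj` afterwards.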