Mahalanobis distance
Computer science
Regret
Metric (unit)
Kullback-Leibler divergence
Mathematical optimization
Context (archaeology)
Artificial intelligence
Algorithm
Mathematics
Machine learning
Operations management
Biology
Paleontology
Economics
Authors
Jason V. Davis,Brian Kulis,Prateek Jain,Suvrit Sra,Inderjit S. Dhillon
Identifier
DOI:10.1145/1273496.1273523
Abstract
In this paper, we present an information-theoretic approach to learning a Mahalanobis distance function. We formulate the problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the distance function. We express this problem as a particular Bregman optimization problem---that of minimizing the LogDet divergence subject to linear constraints. Our resulting algorithm has several advantages over existing methods. First, our method can handle a wide variety of constraints and can optionally incorporate a prior on the distance function. Second, it is fast and scalable. Unlike most existing methods, no eigenvalue computations or semi-definite programming are required. We also present an online version and derive regret bounds for the resulting algorithm. Finally, we evaluate our method on a recent error reporting system for software called Clarify, in the context of metric learning for nearest neighbor classification, as well as on standard data sets.
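As a sketch of the formulation described in the abstract, the learned Mahalanobis matrix A is kept close (in LogDet divergence) to a prior matrix while satisfying linear pairwise distance constraints. The specific symbols below (A_0 for the prior, u and l for the similarity/dissimilarity thresholds, S and D for the pair sets) follow common metric-learning notation and are assumptions, not quotations from the paper.

% Mahalanobis distance parameterized by a positive semi-definite matrix A:
%   d_A(x_i, x_j) = (x_i - x_j)^T A (x_i - x_j)
% LogDet divergence between A and an n x n prior A_0, minimized subject to
% linear constraints on the pairwise distances:
\begin{aligned}
\min_{A \succeq 0} \quad & D_{\ell d}(A, A_0) \;=\; \operatorname{tr}\!\left(A A_0^{-1}\right) - \log\det\!\left(A A_0^{-1}\right) - n \\
\text{s.t.} \quad & (x_i - x_j)^\top A \,(x_i - x_j) \;\le\; u, \qquad (i,j) \in \mathcal{S} \ \text{(similar pairs)} \\
& (x_i - x_j)^\top A \,(x_i - x_j) \;\ge\; l, \qquad (i,j) \in \mathcal{D} \ \text{(dissimilar pairs)}
\end{aligned}

Because the objective is a Bregman divergence over positive definite matrices, such constraints can be handled by successive projections without eigenvalue computations or semi-definite programming, which is the scalability advantage the abstract highlights.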