Statistical manifold
Mathematics
Metric (mathematics)
Manifold (mathematics)
Information geometry
Kernel (statistics)
Laplace operator
Euclidean space
Riemannian geometry
Riemannian manifold
Multinomial distribution
Gaussian distribution
Heat kernel
Statistical model
Applied mathematics
Pure mathematics
Mathematical analysis
Geometry
Statistics
Scalar curvature
Curvature
Mechanical engineering
Operations management
Physics
Quantum mechanics
Economics
Engineering
Authors
John Lafferty,Guy Lebanon
Abstract
A family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. The kernels are based on the heat equation on the Riemannian manifold defined by the Fisher information metric associated with a statistical family, and generalize the Gaussian kernel of Euclidean space. As an important special case, kernels based on the geometry of multinomial families are derived, leading to kernel-based learning algorithms that apply naturally to discrete data. Bounds on covering numbers and Rademacher averages for the kernels are proved using bounds on the eigenvalues of the Laplacian on Riemannian manifolds. Experimental results are presented for document classification, for which the use of multinomial geometry is natural and well motivated, and improvements are obtained over Gaussian and linear kernels, which have been the standard for text classification.
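To make the multinomial special case more concrete, the following is a minimal sketch of how such a diffusion kernel can be evaluated on discrete data such as L1-normalized term-frequency vectors. It assumes the common leading-order (Gaussian-like) approximation exp(-d^2 / (4t)) of the heat kernel, where d is the geodesic distance induced by the Fisher information metric on the probability simplex, d(p, q) = 2 * arccos(sum_i sqrt(p_i * q_i)); the function name multinomial_diffusion_kernel and the diffusion-time parameter t are illustrative choices, not taken from the abstract.

```python
import numpy as np

def multinomial_diffusion_kernel(X, Y, t=0.1):
    """Leading-order approximation of the diffusion (heat) kernel on the
    multinomial simplex under the Fisher information metric.

    X: (n, d) array, Y: (m, d) array of nonnegative rows summing to 1
    (e.g. term-frequency representations of documents).
    t: assumed diffusion-time parameter, playing a bandwidth-like role.
    """
    # Bhattacharyya-type affinity: sum_i sqrt(p_i * q_i), a value in [0, 1]
    affinity = np.sqrt(X) @ np.sqrt(Y).T
    affinity = np.clip(affinity, 0.0, 1.0)      # guard against rounding error
    geodesic = 2.0 * np.arccos(affinity)        # Fisher geodesic distance on the simplex
    return np.exp(-geodesic**2 / (4.0 * t))     # Gaussian-like heat-kernel approximation

if __name__ == "__main__":
    # Toy "documents": each row is a normalized word-frequency vector.
    docs = np.array([[0.5, 0.3, 0.2],
                     [0.1, 0.7, 0.2],
                     [0.4, 0.4, 0.2]])
    K = multinomial_diffusion_kernel(docs, docs, t=0.1)
    print(K)
```

The resulting symmetric Gram matrix can be used in place of a Gaussian Gram matrix in a kernel-based learner, with the diffusion time t tuned much as one would tune the bandwidth of a Gaussian kernel.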