Reproducing kernel Hilbert space
Kernel embedding of distributions
Kernel method
Kernel (algebra)
Computer science
Tree kernel
Artificial intelligence
Machine learning
Hilbert space
Binary classification
Radial basis function kernel
Representer theorem
Domain (mathematical analysis)
Range (aeronautics)
Polynomial kernel
Mathematics
Support vector machine
Discrete mathematics
Mathematical analysis
Composite material
Materials science
Authors
Thomas Hofmann,Bernhard Schölkopf,Alexander J. Smola
Identifier
DOI: 10.1214/009053607000000677
Abstract
We review machine learning methods employing positive definite kernels. These methods formulate learning and estimation problems in a reproducing kernel Hilbert space (RKHS) of functions defined on the data domain, expanded in terms of a kernel. Working in linear spaces of functions has the benefit of facilitating the construction and analysis of learning algorithms while at the same time allowing large classes of functions. The latter include nonlinear functions as well as functions defined on nonvectorial data. We cover a wide range of methods, ranging from binary classifiers to sophisticated methods for estimation with structured data.
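To illustrate the kernel expansion the abstract refers to, the following is a minimal sketch (not taken from the paper): a regularized estimator in an RBF-kernel RKHS, whose solution by the representer theorem has the form f(x) = Σ_i α_i k(x_i, x), applied to a toy binary-classification problem. The regularization constant `lam`, bandwidth `gamma`, and the synthetic data are illustrative assumptions.

```python
# Sketch of RKHS learning with a positive definite kernel:
# fit f(x) = sum_i alpha_i * k(x_i, x) via kernel ridge regression,
# then classify by the sign of f(x). Parameters are illustrative.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_kernel_ridge(X, y, lam=0.1, gamma=1.0):
    """Solve (K + lam * I) alpha = y; the regularized RKHS solution
    lies in the span of the kernel functions k(x_i, .)."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    """Evaluate the kernel expansion f(x) = sum_i alpha_i k(x_i, x)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy binary classification: labels in {-1, +1}, nonlinear (XOR-like) target.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] * X[:, 1])
alpha = fit_kernel_ridge(X, y, lam=0.1, gamma=2.0)
y_hat = np.sign(predict(X, alpha, X, gamma=2.0))
print("training accuracy:", (y_hat == y).mean())
```

The same expansion underlies the binary classifiers mentioned in the abstract (e.g. support vector machines), which differ only in the loss and regularization used to determine the coefficients α.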