Generalization
Support vector machine
Range (aeronautics)
Kernel (algebra)
Radial basis function
Radial basis function kernel
Artificial intelligence
Gaussian function
Polynomial kernel
Gaussian distribution
Mathematics
Pattern recognition (psychology)
Statistical learning theory
Basis (linear algebra)
Kernel method
Artificial neural network
Applied mathematics
Computer science
Polynomial
Machine learning
Discrete mathematics
Mathematical analysis
Physics
Materials science
Geometry
Composite material
Quantum mechanics
Authors
Wenjian Wang, Zongben Xu, Jane W. Z. Lu, Xiaoyun Zhang
Source
Journal: Neurocomputing
[Elsevier]
Date: 2003-10-01
Volume/Issue: 55 (3-4): 643-663
Citations: 375
Identifiers
DOI: 10.1016/s0925-2312(02)00632-x
Abstract
Based on statistical learning theory, the Support Vector Machine (SVM) is a novel type of learning machine that includes polynomial, neural-network, and radial basis function (RBF) machines as special cases. In the RBF case, the Gaussian kernel is commonly used, and its spread parameter σ is essential to the generalization performance of SVMs. In this paper, the determination of σ is studied through a discussion of its influence on generalization performance. For classification problems, the optimal σ can be computed on the basis of Fisher discrimination. For regression problems, based on scale-space theory, we demonstrate the existence of a range of σ within which the generalization performance is stable, and an appropriate σ within this range can be found via dynamic evaluation. In addition, a lower bound on the iteration step size of σ is given. Simulation results show the effectiveness of the presented method.
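The abstract's idea for classification can be sketched as follows: score each candidate σ by a Fisher-like class-separability criterion evaluated in the Gaussian-kernel feature space (between-class distance over within-class scatter, both computable from kernel matrices), then pick the σ that maximizes it over a fixed step grid. This is a minimal illustrative sketch, not the paper's exact Fisher-discrimination formula; the function names, the scatter proxy, and the grid are assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # K(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fisher_like_score(X1, X2, sigma):
    """Between-class distance over within-class scatter in the
    Gaussian-kernel feature space (a simple separability proxy;
    hypothetical, not the authors' exact criterion)."""
    K11 = gaussian_kernel(X1, X1, sigma)
    K22 = gaussian_kernel(X2, X2, sigma)
    K12 = gaussian_kernel(X1, X2, sigma)
    # Squared distance between class means in feature space.
    between = K11.mean() - 2.0 * K12.mean() + K22.mean()
    # For the Gaussian kernel ||phi(x)|| = 1, so the mean within-class
    # scatter of class c reduces to 1 - mean(K_cc).
    within = (1.0 - K11.mean()) + (1.0 - K22.mean())
    return between / (within + 1e-12)

# Grid search over sigma with a fixed step, as a stand-in for the
# paper's dynamic evaluation with a bounded iteration step size.
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(50, 2))  # synthetic class 1
X2 = rng.normal(3.0, 1.0, size=(50, 2))  # synthetic class 2
sigmas = np.linspace(0.1, 5.0, 50)
best = max(sigmas, key=lambda s: fisher_like_score(X1, X2, s))
```

Too small a σ drives all off-diagonal kernel values toward 0 (every point isolated), while too large a σ drives them all toward 1 (classes indistinguishable); the criterion peaks at an intermediate scale, which matches the abstract's observation that generalization is stable only within a certain range of σ.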