Extreme Learning Machine
Generalization
Computer Science
Structural Risk Minimization
Outlier
Machine Learning
Artificial Intelligence
Shrinkage
Generalization Error
Empirical Risk Minimization
Heteroskedasticity
Support Vector Machine
Algorithm
Artificial Neural Network
Mathematics
Mathematical Analysis
Programming Language
Authors
Wanyu Deng, Qinghua Zheng, Chen Lin
Identifier
DOI: 10.1109/cidm.2009.4938676
Abstract
The extreme learning machine (ELM) proposed by Huang G-B has attracted much attention for its extremely fast training speed and good generalization performance. However, it can still be regarded as an empirical risk minimization scheme and tends to produce over-fitted models. Additionally, since ELM does not account for heteroskedasticity in real applications, its performance is seriously degraded when outliers exist in the dataset. To address these drawbacks, we propose a novel algorithm called regularized extreme learning machine, based on the structural risk minimization principle and weighted least squares. The generalization performance of the proposed algorithm was improved significantly in most cases without increasing training time.
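A minimal sketch of the general idea described in the abstract, assuming a single-hidden-layer network with random input weights, a ridge-style (structural-risk) penalty on the output weights, and an optional sample-weight argument standing in for the weighted least-squares step. The function names, the tanh activation, and the weighting interface are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def relm_fit(X, y, n_hidden=100, C=1.0, sample_weight=None, seed=0):
    """Sketch of a regularized ELM: random hidden layer plus ridge-regularized
    (optionally weighted) least-squares output weights. Illustrative only."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    if sample_weight is None:
        HtH, Hty = H.T @ H, H.T @ y
    else:
        D = np.diag(sample_weight)                   # weighted least squares: down-weight outliers
        HtH, Hty = H.T @ D @ H, H.T @ D @ y
    # Regularized solution: beta = (H'H + I/C)^(-1) H'y
    beta = np.linalg.solve(HtH + np.eye(n_hidden) / C, Hty)
    return W, b, beta

def relm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical usage on synthetic 1-D regression data with injected outliers
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
y[::25] += 3.0                                       # inject a few outliers
w = np.ones_like(y)
w[::25] = 0.1                                        # down-weight the outlying samples
W, b, beta = relm_fit(X, y, n_hidden=50, C=100.0, sample_weight=w)
y_hat = relm_predict(X, W, b, beta)
```

In a full reweighted scheme, the sample weights would typically be derived from the residuals of an initial unweighted fit rather than supplied by hand.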