Computer science
Support vector machine
Data stream mining
Curse of dimensionality
Quadratic programming
Generalization
Regression
Noise (video)
Regression analysis
Algorithm
Time series
Data mining
Machine learning
Artificial intelligence
Mathematical optimization
Mathematics
Statistics
Image (mathematics)
Mathematical analysis
Authors
Hang Yu, Jie Lü, Guangquan Zhang
Source
Journal: IEEE Transactions on Knowledge and Data Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2020-01-01
Volume/Issue: 1-1
Citations: 39
Identifier
DOI:10.1109/tkde.2020.2979967
Abstract
Support vector regression (SVR) is a flexible regression algorithm: its computational complexity does not depend on the dimensionality of the input space, and it has excellent generalization capability. However, a central assumption of SVR is that all the required data is available at construction time, which means these algorithms cannot be used with data streams. Incremental SVR has been offered as a potential solution, but its accuracy suffers under noise and its learning speed is slow. To overcome these two limitations, we propose a novel incremental regression algorithm, called online robust support vector regression (ORSVR). ORSVR solves nonparallel bound functions simultaneously; hence, the large quadratic programming problem (QPP) in classical ν-SVR is decomposed into two smaller QPPs. An incremental learning algorithm then solves each QPP step by step. The results of a series of comparative experiments demonstrate that the ORSVR algorithm efficiently solves regression problems in data streams, with or without noise, and speeds up the learning process.
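For context, the single large QPP that the abstract refers to is the primal problem of classical ν-SVR (Schölkopf et al.), sketched below in the usual notation: feature map φ, bias b, slack variables ξ_i, ξ_i*, and a tube width ε that is itself optimized under the control of the parameter ν. The pair of nonparallel bound functions and the exact two-QPP decomposition used by ORSVR are defined in the paper and are not reproduced here.

\[
\begin{aligned}
\min_{\,w,\,b,\,\varepsilon,\,\xi,\,\xi^{*}}\quad & \frac{1}{2}\lVert w\rVert^{2} + C\Bigl(\nu\varepsilon + \frac{1}{\ell}\sum_{i=1}^{\ell}\bigl(\xi_i + \xi_i^{*}\bigr)\Bigr) \\
\text{s.t.}\quad & \bigl(w^{\top}\phi(x_i) + b\bigr) - y_i \le \varepsilon + \xi_i, \\
& y_i - \bigl(w^{\top}\phi(x_i) + b\bigr) \le \varepsilon + \xi_i^{*}, \\
& \xi_i \ge 0,\quad \xi_i^{*} \ge 0,\quad \varepsilon \ge 0, \qquad i = 1, \dots, \ell .
\end{aligned}
\]

Splitting this problem into two smaller QPPs, one per bound function, reduces the size of each optimization and is what allows the incremental solver described in the abstract to update the model sample by sample.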