Feature selection
Consistency (knowledge base)
Variable (mathematics)
Parametric statistics
Covariate
Computer science
Nonparametric statistics
Selection (genetic algorithm)
Function (biology)
Instrumental variable
Regression analysis
Linear regression
Model selection
Mathematics
Artificial intelligence
Mathematical optimization
Machine learning
Statistics
Mathematical analysis
Evolutionary biology
Biology
Authors
Hong Chen, Youcheng Fu, Xue Jiang, Yanhong Chen, Weifu Li, Yicong Zhou, Feng Zheng
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-18
Volume/Issue: 35 (7): 9686-9699
Identifier
DOI: 10.1109/tnnls.2023.3236345
Abstract
Variable selection methods aim to select the key covariates related to the response variable for learning problems with high-dimensional data. Typical methods of variable selection are formulated in terms of sparse mean regression with a parametric hypothesis class, such as linear functions or additive functions. Despite rapid progress, the existing methods depend heavily on the chosen parametric function class and are incapable of handling variable selection for problems where the data noise is heavy-tailed or skewed. To circumvent these drawbacks, we propose sparse gradient learning with the mode-induced loss (SGLML) for robust model-free (MF) variable selection. A theoretical analysis of SGLML is established, covering an upper bound on the excess risk and the consistency of variable selection, which guarantees its ability, under mild conditions, to estimate gradients (through the lens of gradient risk) and to identify informative variables. Experimental analysis on simulated and real data demonstrates the competitive performance of our method over previous gradient learning (GL) methods.
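To make the idea behind gradient-learning-based variable selection concrete, the following is a minimal illustrative sketch, not the paper's SGLML algorithm: it estimates per-sample gradients with local linear fits whose residuals are reweighted by a Gaussian (mode-induced, Welsch-type) kernel via iteratively reweighted least squares, then ranks variables by their average gradient magnitude. All function names, parameters, and defaults here are hypothetical choices for illustration.

```python
import numpy as np

def gradient_variable_scores(X, y, sigma=1.0, bandwidth=1.0, n_iter=20):
    """Score variables by estimated gradient magnitude (illustrative sketch).

    For each sample x_i, fit a local linear model y ~ b0 + (x - x_i) @ beta,
    where each residual r is downweighted by the mode-induced factor
    exp(-r^2 / (2 sigma^2)), giving robustness to heavy-tailed noise.
    The fit is solved by iteratively reweighted least squares (IRLS).
    Returns a length-d array; informative variables get large scores.
    """
    n, d = X.shape
    grads = np.zeros((n, d))
    for i in range(n):
        diff = X - X[i]                                   # (n, d) displacements
        w_loc = np.exp(-np.sum(diff**2, 1) / (2 * bandwidth**2))  # locality
        b0, beta = y[i], np.zeros(d)
        A = np.c_[np.ones(n), diff]                       # local linear design
        for _ in range(n_iter):
            r = y - b0 - diff @ beta                      # current residuals
            w = w_loc * np.exp(-r**2 / (2 * sigma**2))    # mode-induced weights
            WA = A * w[:, None]
            # Ridge-stabilized weighted normal equations: A^T W A c = A^T W y
            coef = np.linalg.solve(WA.T @ A + 1e-6 * np.eye(d + 1), WA.T @ y)
            b0, beta = coef[0], coef[1:]
        grads[i] = beta                                   # gradient estimate at x_i
    return np.sqrt(np.mean(grads**2, axis=0))             # per-variable score

# Usage: y depends only on the first two of eight covariates,
# corrupted by heavy-tailed (Student-t) noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (80, 8))
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.standard_t(3, 80)
scores = gradient_variable_scores(X, y)
selected = set(np.argsort(scores)[-2:])                   # top-2 variables
```

In this toy run the two largest scores land on the informative coordinates 0 and 1, while the six noise coordinates receive near-zero gradient scores; the mode-induced reweighting is what keeps outlying residuals from dominating each local fit.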