Lasso (statistics)
SCAD
Independence (probability theory)
Chen
Feature selection
Dimension
Sample size determination
Mathematics
Regression analysis
Regression
Variable (mathematics)
Computer science
Statistics
Applied mathematics
Algorithm
Artificial intelligence
Combinatorics
Identifier
DOI:10.1198/jasa.2008.tm08516
Abstract
Motivated by the seminal theory of sure independence screening (SIS; Fan and Lv 2008), we investigate another popular and classical variable screening method, namely forward regression (FR). Our theoretical analysis reveals that FR can identify all relevant predictors consistently, even if the predictor dimension is substantially larger than the sample size. In particular, if the dimension of the true model is finite, FR can discover all relevant predictors within a finite number of steps. To select the “best” candidate from the models generated by FR in practice, the recently proposed BIC criterion of Chen and Chen (2008) can be used. The resulting model can then serve as an excellent starting point, from which many existing variable selection methods (e.g., SCAD and adaptive LASSO) can be applied directly. FR’s outstanding finite-sample performance is confirmed by extensive numerical studies.
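The two-stage procedure the abstract describes (greedy forward regression to build a solution path, then a BIC-type criterion to pick one model from that path) can be sketched as follows. This is a minimal illustration, not the paper's implementation; it assumes centered data, uses ordinary least squares at each step, and uses an EBIC-style penalty of the form log(RSS/n) + |S|(log n + 2 log p)/n, which is one common way the Chen and Chen (2008) criterion is written for this setting.

```python
import numpy as np

def forward_regression_screening(X, y, max_steps):
    """Greedy forward regression: at each step, add the predictor that
    most reduces the residual sum of squares, and record the path."""
    n, p = X.shape
    selected, path = [], []
    remaining = set(range(p))
    for _ in range(max_steps):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = X[:, selected + [j]]
            # least-squares fit on the candidate predictor set
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = np.sum((y - cols @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
        path.append((list(selected), best_rss))
    return path

def ebic_select(path, n, p):
    """Pick the model on the FR path minimizing an EBIC-style criterion:
    log(RSS/n) + |S| * (log(n) + 2*log(p)) / n  (assumed form)."""
    def crit(step):
        S, rss = step
        return np.log(rss / n) + len(S) * (np.log(n) + 2 * np.log(p)) / n
    return min(path, key=crit)[0]
```

In a toy simulation with p much larger than n and a few strong signals, the FR path typically picks up the true predictors within the first few steps, and the EBIC-type criterion then truncates the path to a small model, which could be handed to SCAD or the adaptive LASSO as a starting point.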