Rank (linear algebra)
Principal component analysis
Mathematics
A priori and a posteriori
Least-squares function approximation
Factor analysis
Applied mathematics
Statistics
Mathematical optimization
Combinatorics
Philosophy
Epistemology
Estimator
Source
Journal: Cornell University - arXiv
Date: 2017-08-27
Citations: 11
Abstract
It is known that the common factors in a large panel of data can be consistently estimated by the method of principal components, and principal components can be constructed by iterative least squares regressions. Replacing least squares with ridge regressions turns out to have the effect of shrinking the singular values of the common component and possibly reducing its rank. The method is used in the machine learning literature to recover low-rank matrices. We study the procedure from the perspective of estimating a minimum-rank approximate factor model. We show that the constrained factor estimates are biased but can be more efficient in terms of mean-squared errors. Rank consideration suggests a data-dependent penalty for selecting the number of factors. The new criterion is more conservative in cases when the nominal number of factors is inflated by the presence of weak factors or large measurement noise. The framework is extended to incorporate a priori linear constraints on the loadings. We provide asymptotic results that can be used to test economic hypotheses.
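The shrinkage effect described in the abstract can be illustrated numerically. The sketch below (an assumption-laden toy, not the paper's exact estimator) alternates ridge regressions for the factors and loadings of a panel X. At the fixed point, the singular values of the fitted common component F Λ' are soft-thresholded versions of the singular values of X, i.e. shrunk by the ridge penalty λ and set to zero when smaller than λ, which is how the procedure can reduce the rank of the common component. The panel dimensions, working rank, and penalty value are arbitrary choices for illustration.

```python
# Toy illustration: alternating ridge regressions shrink the singular
# values of the fitted common component toward max(sigma_i - lam, 0),
# a soft-thresholded SVD. This is a sketch under assumed dimensions,
# not the paper's estimator.
import numpy as np

rng = np.random.default_rng(0)
T, N, k, lam = 200, 50, 5, 2.0          # panel size, working rank, ridge penalty

# Rank-k panel of "common components" (no idiosyncratic noise, for clarity).
X = rng.standard_normal((T, k)) @ rng.standard_normal((k, N))

F = rng.standard_normal((T, k))          # initial factor estimate
for _ in range(500):
    # Ridge regression of X on F gives loadings; of X on Lambda gives factors.
    Lam = X.T @ F @ np.linalg.inv(F.T @ F + lam * np.eye(k))
    F = X @ Lam @ np.linalg.inv(Lam.T @ Lam + lam * np.eye(k))

fitted = F @ Lam.T
s_fit = np.linalg.svd(fitted, compute_uv=False)[:k]
s_x = np.linalg.svd(X, compute_uv=False)[:k]
print(np.round(s_fit, 3))                       # shrunken singular values
print(np.round(np.maximum(s_x - lam, 0.0), 3))  # soft-thresholded targets
```

With λ = 0 the updates reduce to the iterative least-squares construction of principal components mentioned in the abstract; a positive λ biases the estimates but, as the abstract notes, can lower mean-squared error.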