Robust principal component analysis (PCA) for feature extraction, which adopts various robust norms as the distance metric, has been shown to be effective in data reconstruction and recognition tasks. However, some existing approaches do not achieve satisfactory performance and robustness owing to limitations of the norms used to measure distances in their models. To address this problem, we propose a new formulation of robust PCA for feature extraction, called flexible capped PCA (FCPCA), which uses the capped L2,p-norm distance metric to minimize the reconstruction errors of data. In addition to being rotationally invariant, the capped L2,p-norm distance metric can better weaken the influence of outliers, even when their effect is magnified by a large value of p. This characteristic makes our model more powerful than existing approaches. Moreover, the proposed model is more flexible than many other robust models, several of which can be derived from it by specifying the value of p. However, solving the resulting optimization problem is challenging. One of the key contributions of this study is the design of a new iterative algorithm for deriving an optimal solution, in which each iteration solves an eigenvalue problem associated with a weighted covariance matrix. Experimental results on several public datasets demonstrate the algorithm's effectiveness.
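
To make the iterative scheme concrete, the following is a minimal, illustrative Python sketch (not the authors' released implementation) of an alternating procedure consistent with the description above: each pass reweights samples according to an assumed capped L2,p rule and then solves an eigenvalue problem on the resulting weighted covariance matrix. The exponent p, the cap epsilon, and the exact weight formula used here are assumptions made for illustration only.

```python
# Illustrative sketch of a capped L2,p-norm robust PCA via iteratively
# reweighted covariance eigen-decomposition. The weight rule and cap are
# assumptions chosen for illustration, not the paper's exact derivation.
import numpy as np


def capped_l2p_pca(X, k, p=1.0, epsilon=1.0, n_iter=50, tol=1e-6):
    """Fit a d x k orthonormal projection W to data X (n samples x d features)."""
    n, d = X.shape
    Xc = X - X.mean(axis=0)                      # center the data
    # Initialize W with standard PCA (top-k eigenvectors of the covariance).
    _, V = np.linalg.eigh(Xc.T @ Xc / n)
    W = V[:, -k:]
    prev_obj = np.inf
    for _ in range(n_iter):
        # Per-sample reconstruction error under the current subspace.
        R = Xc - Xc @ W @ W.T
        err = np.linalg.norm(R, axis=1)
        # Assumed capped L2,p reweighting: samples whose capped term is
        # saturated (err**p > epsilon) receive zero weight, so outliers
        # no longer influence the subspace estimate.
        w = np.where(err**p <= epsilon,
                     0.5 * p * np.maximum(err, 1e-12)**(p - 2),
                     0.0)
        # Eigenvalue problem on the weighted covariance matrix:
        # minimizing weighted reconstruction error is equivalent to taking
        # the top-k eigenvectors of this matrix.
        C = (Xc * w[:, None]).T @ Xc
        _, V = np.linalg.eigh(C)
        W = V[:, -k:]
        # Capped L2,p objective value, used only as a stopping criterion here.
        obj = np.sum(np.minimum(err**p, epsilon))
        if abs(prev_obj - obj) < tol:
            break
        prev_obj = obj
    return W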