Computer Science
Financial Distress
Artificial Intelligence
Finance
Business
Financial System
Authors
Zijiao Zhang, Chong Wu, Shiyou Qu, Xiaofang Chen
Identifier
DOI:10.1016/j.ipm.2022.102988
Abstract
External stakeholders require accurate and explainable financial distress prediction (FDP) models. Complex machine learning algorithms offer high accuracy, but most lack explanatory power, making external stakeholders cautious about adopting them. Therefore, an explainable artificial intelligence approach for FDP is proposed here, comprising a whole-process ensemble method and an explanation framework. The ensemble algorithm, spanning feature selection through predictor construction, achieves high accuracy on the actual case, and the interpretation framework meets the needs of external users by generating both local and global explanations. First, a two-stage scheme combining filter and wrapper techniques is designed for feature selection. Second, multiple ensemble models are explored and evaluated on the actual case. Finally, Shapley additive explanations, counterfactual explanations and partial dependence plots are employed to enhance model interpretability. Using financial data of Chinese listed companies from 2007 to 2020 as the dataset, LightGBM attains the highest AUC, with a value of 0.92. Local explanations help individual enterprises identify the key features that lead to their financial distress, and counterfactual explanations provide improvement strategies. By analyzing feature importance and the impact of feature interactions on the results, global explanations improve the transparency and credibility of 'black box' models.
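The Shapley additive explanations mentioned in the abstract attribute a model's prediction for one instance to its input features. As a minimal illustration of the underlying principle (not the authors' implementation, which uses the SHAP library with LightGBM), the sketch below computes exact Shapley values for a tiny, hypothetical "distress score" model; the feature names, weights and values are invented for the example.

```python
from itertools import permutations

def shapley_values(model, instance, baseline):
    """Exact Shapley values for a model with few features.

    For each feature, average its marginal contribution to the
    prediction over all orderings in which features are switched
    from the baseline value to the instance value.
    """
    n = len(instance)
    phi = [0.0] * n
    perms = list(permutations(range(n)))
    for order in perms:
        current = list(baseline)
        prev_val = model(current)
        for feat in order:
            current[feat] = instance[feat]   # "turn on" this feature
            new_val = model(current)
            phi[feat] += new_val - prev_val  # marginal contribution
            prev_val = new_val
    return [p / len(perms) for p in phi]

# Hypothetical linear "distress score": weights on leverage, ROA
# and liquidity are made up purely for illustration.
def toy_model(x):
    return 0.6 * x[0] - 0.3 * x[1] - 0.1 * x[2]

baseline = [0.0, 0.0, 0.0]      # reference (feature "absent") values
instance = [0.8, -0.2, 0.1]     # the firm-year to explain
phi = shapley_values(toy_model, instance, baseline)
# Local accuracy: the attributions sum to
# model(instance) - model(baseline).
```

For a linear model the Shapley value of each feature reduces to its weight times its deviation from the baseline, which makes the toy result easy to verify by hand; tree ensembles such as LightGBM require the polynomial-time TreeSHAP algorithm instead of this factorial-time enumeration.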