Taylor series
Rational function
Black box
Artificial neural network
Polynomial
Computer science
Polynomial expansion
Corollary
Convergence (economics)
Function (biology)
Artificial intelligence
Interpretation (philosophy)
Algorithm
Theoretical computer science
Mathematics
Pure mathematics
Biology
Evolutionary biology
Economic growth
Mathematical analysis
Economics
Programming language
Authors
Tingxiong Xiao, Weihang Zhang, Yuxiao Cheng, Jinli Suo
Identifier
DOI: 10.1109/TPAMI.2024.3399197
Abstract
Despite their remarkable performance, deep neural networks remain mostly "black boxes"; this lack of interpretability hinders their wide application in fields that require rational decision making. Here we introduce HOPE (High-order Polynomial Expansion), a method for expanding a network into a high-order Taylor polynomial around a reference input. Specifically, we derive the high-order derivative rule for composite functions and extend it to neural networks to obtain their high-order derivatives quickly and accurately. From these derivatives we construct the Taylor polynomial of the network, which provides an explicit expression of the network's local interpretation. Combining the Taylor polynomials obtained under different reference inputs yields a global interpretation of the network. Numerical analysis confirms the high accuracy, low computational complexity, and good convergence of the proposed method. Moreover, we demonstrate HOPE's wide applications built on deep learning, including function discovery, fast inference, and feature selection. We compare HOPE with other XAI methods and demonstrate its advantages. The code is available at https://github.com/HarryPotterXTX/HOPE.git .
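To make the core idea concrete, the sketch below expands a small network into a second-order Taylor polynomial around a reference input and compares the polynomial's prediction with the network's output nearby. It is only an illustration of the local-expansion form described in the abstract: it uses plain PyTorch autograd (torch.autograd.functional.jacobian / hessian) rather than HOPE's composite-function derivative rule, and the toy network, variable names, and truncation order are assumptions for demonstration, not taken from the paper or its repository.

```python
# Minimal sketch: order-2 Taylor expansion of a network at a reference input.
# This is NOT HOPE's fast derivative rule; it relies on naive autograd.
import torch
import torch.nn as nn
from torch.autograd.functional import jacobian, hessian

torch.manual_seed(0)

# Toy network with a scalar output so the Hessian is an (n, n) matrix.
net = nn.Sequential(nn.Linear(3, 16), nn.Tanh(), nn.Linear(16, 1))
f = lambda x: net(x).squeeze()

x0 = torch.zeros(3)            # reference input of the local expansion
f0 = f(x0).detach()            # zeroth-order coefficient
g = jacobian(f, x0)            # first-order coefficients (gradient at x0)
H = hessian(f, x0)             # second-order coefficients

def taylor2(x):
    """Evaluate the order-2 Taylor polynomial of f expanded around x0."""
    d = x - x0
    return f0 + g @ d + 0.5 * (d @ H @ d)

# Near the reference input, the explicit polynomial tracks the network closely.
x = x0 + 0.1 * torch.randn(3)
print(f(x).item(), taylor2(x).item())
```

Repeating such an expansion at several reference inputs and aggregating the resulting polynomials mirrors, in spirit, the local-to-global interpretation step the abstract describes, though HOPE obtains the high-order coefficients analytically rather than by repeated autograd calls.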