Keywords: mathematics, gradient descent, invertible matrix, convergence (economics), algorithm, descent direction, applied mathematics, differentiable function, singularity, artificial neural network, interval (graph theory), sequence (biology), mathematical optimization, computer science, mathematical analysis, pure mathematics, artificial intelligence, combinatorics, biology, economics, genetics, economic growth
Authors
Honggui Han, Chenxuan Sun, Xiaolong Wu, Hongyan Yang, Junfei Qiao
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems (Institute of Electrical and Electronics Engineers)
Date: 2022-12-07
Volume/Issue: 35 (6): 8176-8189
Cited by: 1
Identifier
DOI: 10.1109/tnnls.2022.3225181
Abstract
The interval type-2 fuzzy neural network (IT2FNN) is widely used to model nonlinear systems. Unfortunately, a gradient descent-based IT2FNN with uncertain variances suffers from low convergence speed due to its inherent singularity. To cope with this problem, a nonsingular gradient descent algorithm (NSGDA) is developed in this article to update the IT2FNN. First, the widths of the type-2 fuzzy rules are transformed into root inverse variances (RIVs), which always satisfy the sufficient condition for differentiability. Second, singular RIVs are reformulated by the nonsingular Shapley-based matrices associated with the type-2 fuzzy rules. This averts the convergence stagnation caused by the zero derivatives of singular RIVs, thereby sustaining gradient convergence. Third, an integrated-form update strategy (IUS) is designed to obtain the derivatives of the parameters of the IT2FNN, including the RIVs, centers, weight coefficients, deviations, and proportionality coefficient. These parameters are packed into multiple subvariable matrices, which can accelerate gradient convergence through parallel computation instead of sequential iteration. Finally, experiments show that the proposed NSGDA-based IT2FNN improves convergence speed through the improved learning algorithm.
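The core idea of the width-to-RIV transformation can be illustrated with a minimal sketch. Assuming a Gaussian membership function (the paper's exact rule form, the Shapley-based reformulation, and the update rule below are not given in the abstract; the function names and the plain gradient step are illustrative only), parameterizing the rule by a root inverse variance r = 1/sigma keeps the derivative well-defined even where a direct width parameterization would blow up or stagnate:

```python
import numpy as np

def gaussian_membership_riv(x, c, r):
    """Gaussian membership parameterized by root inverse variance r = 1/sigma:
    mu(x) = exp(-0.5 * (r * (x - c))**2). Defined for every real r."""
    return np.exp(-0.5 * (r * (x - c)) ** 2)

def grad_wrt_riv(x, c, r):
    """d mu / d r = -r * (x - c)**2 * mu(x).
    Polynomial in r, so it stays finite for all r (including r = 0),
    whereas d mu / d sigma contains a 1/sigma**3 factor that is singular."""
    return -r * (x - c) ** 2 * gaussian_membership_riv(x, c, r)

# One hypothetical gradient-descent step on r toward a target membership value.
x, c, r, lr, target = 1.5, 0.0, 0.8, 0.1, 1.0
err = gaussian_membership_riv(x, c, r) - target
r_updated = r - lr * err * grad_wrt_riv(x, c, r)
```

The sketch only shows why the RIV parameterization avoids the zero/undefined derivatives that stall plain width updates; the paper's actual NSGDA additionally reformulates singular RIVs with Shapley-based matrices and updates all parameter groups jointly via the IUS.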