Mathematics
Reciprocal
Bounded function
Differentiable function
Exponential stability
Regular polygon
Theory (stability in learning)
Convex combination
Applied mathematics
Artificial neural network
Convex optimization
Mathematical analysis
Computer science
Nonlinear system
Quantum mechanics
Geometry
Machine learning
Physics
Philosophy
Linguistics
Authors
Shenquan Wang, Wenchengyu Ji, Yulian Jiang, Derong Liu
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2019-12-17
Volume/Issue: 31 (10): 4157-4169
Citations: 30
Identifier
DOI: 10.1109/TNNLS.2019.2952410
Abstract
This article investigates global asymptotic stability for neural networks (NNs) with a time-varying delay that is differentiable and uniformly bounded and whose derivative exists and is upper-bounded. First, an extended secondary delay partitioning technique is proposed to construct a novel Lyapunov-Krasovskii functional in which both single-integral and double-integral state variables are considered, whereas traditional secondary delay partitioning handles only the single-integral ones. Second, a novel free-weight matrix equality (FWME) is presented that resolves the reciprocal convex combination problem equivalently and directly, without the Schur complement; it eliminates the need for positive definite matrices and is less conservative and restrictive than various improved reciprocal convex inequalities. Furthermore, by combining the extended secondary delay partitioning, the equivalent reciprocal convex combination technique, and the Bessel-Legendre inequality, two relaxed sufficient conditions ensuring global asymptotic stability of the NNs are obtained for time-varying delays with, respectively, unknown and known lower bounds of the delay derivative. Finally, two examples illustrate the superiority and effectiveness of the presented method.
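For context on the reciprocal convex combination problem the abstract refers to, the following is a minimal numerical sketch of the standard reciprocally convex combination lemma (Park et al., 2011), the inequality family that the paper's FWME is designed to replace. The matrices below are illustrative random instances, not the paper's actual LMI variables, and this is the classical lemma rather than the authors' equality-based method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2

# R > 0 and a coupling matrix S chosen so that [[R, S], [S.T, R]] >= 0.
A = rng.standard_normal((n, n))
R = A @ A.T + n * np.eye(n)   # positive definite by construction
S = 0.5 * R                   # [[R, 0.5R], [0.5R, R]] is PSD

block = np.block([[R, S], [S.T, R]])
assert np.all(np.linalg.eigvalsh(block) >= -1e-9)

# Lemma: for every alpha in (0, 1) and all x, y,
#   (1/alpha) x'Rx + (1/(1-alpha)) y'Ry >= [x; y]' [[R, S], [S.T, R]] [x; y],
# i.e. the delay-dependent reciprocal terms 1/alpha and 1/(1-alpha)
# can be bounded below by a single constant block matrix.
for _ in range(100):
    alpha = rng.uniform(0.05, 0.95)
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)
    lhs = x @ R @ x / alpha + y @ R @ y / (1 - alpha)
    z = np.concatenate([x, y])
    rhs = z @ block @ z
    assert lhs >= rhs - 1e-9

print("reciprocally convex bound held on all samples")
```

In delay-dependent stability analysis, terms of the form `1/alpha` and `1/(1-alpha)` arise from splitting an integral over the delay interval; the lemma removes the dependence on `alpha` at the cost of requiring the coupled block matrix to be positive semidefinite, which is the restriction the paper's free-weight matrix equality avoids.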