Forgetting
Artificial neural network
Generalization
Artificial intelligence
Computer science
Gradient descent
Machine learning
Algorithm
Mathematics
Mathematical analysis
Philosophy
Linguistics
Authors
Yiming Fei, Jiangang Li, Yanan Li
Identifier
DOI: 10.1109/TNNLS.2024.3385407
Abstract
In radial basis function neural network (RBFNN)-based real-time learning tasks, forgetting mechanisms are widely used so that the neural network stays sensitive to new data. With such mechanisms, however, useful knowledge is lost simply because it was learned a long time ago, which we refer to as the passive knowledge forgetting phenomenon. To address this problem, this article proposes a real-time training method named selective memory recursive least squares (SMRLS), in which the classical forgetting mechanisms are recast into a memory mechanism. Unlike a forgetting mechanism, which evaluates the importance of samples mainly according to when they were collected, the memory mechanism evaluates their importance through both the temporal and the spatial distribution of the samples. With SMRLS, the input space of the RBFNN is evenly divided into a finite number of partitions, and a synthesized objective function is developed using synthesized samples from each partition. In addition to the current approximation error, the neural network also updates its weights according to the recorded data from the partition being visited. Compared with classical training methods, including forgetting factor recursive least squares (FFRLS) and stochastic gradient descent (SGD), SMRLS achieves improved learning speed and generalization capability, as demonstrated by simulation results.
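To make the memory mechanism concrete, the following Python sketch illustrates the idea under stated assumptions; it is not the authors' SMRLS implementation. A Gaussian RBFNN is trained by ordinary recursive least squares, the input space is evenly divided into grid cells, and each cell records one sample (here simply the most recent one, a simplification of the paper's synthesized samples). Each update fits the current approximation error and also replays the recorded sample of the cell being visited, so a sample's retention depends on where it lies in the input space rather than only on when it arrived. The class name SelectiveMemoryRLS, the grid rule, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of the selective-memory idea, NOT the authors' exact SMRLS
# algorithm: forgetting is spatial (a revisited cell's old sample is replaced)
# rather than temporal (no forgetting factor is applied over time).
import numpy as np

class SelectiveMemoryRLS:
    def __init__(self, centers, width=0.5, n_cells=10, bounds=(-1.0, 1.0), p0=100.0):
        self.centers = np.asarray(centers)   # RBF centers, shape (m, d)
        self.width = width                   # shared Gaussian width
        m = len(self.centers)
        self.w = np.zeros(m)                 # RBFNN output weights
        self.P = p0 * np.eye(m)              # inverse-correlation matrix for RLS
        self.n_cells = n_cells
        self.bounds = bounds
        self.memory = {}                     # cell index -> recorded (phi, y)

    def phi(self, x):
        # Gaussian RBF regressor vector for input x
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def cell(self, x):
        # Map x to one of the evenly divided partitions of the input space
        lo, hi = self.bounds
        idx = np.floor((x - lo) / (hi - lo) * self.n_cells).astype(int)
        return tuple(np.clip(idx, 0, self.n_cells - 1))

    def _rls_step(self, phi, y):
        # Standard recursive least-squares update (no forgetting factor)
        Pphi = self.P @ phi
        k = Pphi / (1.0 + phi @ Pphi)        # gain vector
        self.w = self.w + k * (y - phi @ self.w)
        self.P = self.P - np.outer(k, Pphi)

    def update(self, x, y):
        phi, c = self.phi(x), self.cell(x)
        self._rls_step(phi, y)               # fit the current approximation error
        if c in self.memory:                 # ...and replay the recorded sample
            self._rls_step(*self.memory[c])  # of the partition being visited
        self.memory[c] = (phi, y)            # spatial forgetting: overwrite cell

    def predict(self, x):
        return self.phi(x) @ self.w

if __name__ == "__main__":
    # Toy stream that dwells in one half of the input space, then the other;
    # after training, the fit in the first region should remain accurate.
    rng = np.random.default_rng(0)
    net = SelectiveMemoryRLS(np.linspace(-1, 1, 15).reshape(-1, 1))
    for t in range(2000):
        x = np.array([rng.uniform(-1, 0) if t < 1000 else rng.uniform(0, 1)])
        net.update(x, np.sin(3.0 * x[0]))
    print(net.predict(np.array([-0.5])), np.sin(-1.5))  # prediction vs. target
```

One design point worth noting: keeping a single sample per partition is what recasts forgetting as memory here, since a cell's old data is discarded only when newer data from the same region replaces it, not merely because time has passed.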