MNIST database
Computer science
Transformer
Algorithm
Artificial intelligence
Machine learning
Pattern recognition (psychology)
Artificial neural network
Authors
Rongrong Fu, Haifeng Liang, Shiwei Wang, Chengcheng Jia, Guangbin Sun, Tengfei Gao, Dan Chen, Yaodong Wang
Identifiers
DOI: 10.1016/j.eswa.2023.121734
Abstract
Due to the efficient model calibration afforded by its unique incremental learning capability, the broad learning system (BLS) has made impressive progress in image-analysis tasks such as image classification and object detection. Inspired by this incremental remodeling success, we proposed a novel transformer-BLS network to achieve a trade-off between model training speed and accuracy. Specifically, we developed sub-BLS layers with a multi-head attention mechanism and combined these layers to construct the transformer-BLS network. In particular, the proposed transformer-BLS network provides four incremental learning algorithms that allow the model to increment its feature nodes, enhancement nodes, input data and sub-BLS layers, respectively, without requiring a full-weight update of the model. Furthermore, we validated the performance of the transformer-BLS network and its four incremental learning algorithms on a variety of image classification datasets. The results demonstrated that the proposed transformer-BLS maintains classification performance on both the MNIST and Fashion-MNIST datasets while saving two thirds of the training time. These findings imply that the proposed method has the potential to significantly reduce model training complexity through this incremental remodeling scheme, while simultaneously improving the incremental learning performance of the original BLS in such contexts, especially in classification tasks on some datasets.
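A minimal sketch of the BLS core that the paper builds on may help fix ideas. The snippet below is an assumption-laden illustration, not the authors' transformer-BLS: it shows random feature nodes and enhancement nodes, a closed-form ridge solve for the output weights, and a naive enhancement-node increment that simply re-solves the output layer. The multi-head attention sub-BLS layers and the paper's four incremental update rules are omitted, and helper names such as `ridge_output_weights` are hypothetical.

```python
import numpy as np

def ridge_output_weights(A, Y, lam=1e-2):
    """Closed-form ridge solution W = (A^T A + lam*I)^{-1} A^T Y."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

rng = np.random.default_rng(0)

# Toy data: 100 samples, 20 input features, 3 one-hot classes.
X = rng.normal(size=(100, 20))
Y = np.eye(3)[rng.integers(0, 3, size=100)]

# Random (fixed) mappings: feature nodes, then enhancement nodes.
Wf = rng.normal(size=(20, 32))          # input -> feature nodes
Wh = rng.normal(size=(32, 16))          # feature nodes -> enhancement nodes
Z = np.tanh(X @ Wf)                     # feature-node activations
H = np.tanh(Z @ Wh)                     # enhancement-node activations
A = np.hstack([Z, H])                   # broad expanded representation

W = ridge_output_weights(A, Y)          # only the output weights are learned
pred = (A @ W).argmax(axis=1)

# Naive "enhancement-node increment": widen A with new random nodes and
# re-solve the output weights. The paper's incremental algorithms instead
# update the existing solution without a full re-solve.
Wh_new = rng.normal(size=(32, 8))
A_inc = np.hstack([A, np.tanh(Z @ Wh_new)])
W_inc = ridge_output_weights(A_inc, Y)
pred_inc = (A_inc @ W_inc).argmax(axis=1)
```

The speed advantage claimed in the abstract comes from this design point: because only the output weights are trained, new nodes, data or layers can be absorbed by updating that linear solve rather than retraining the whole network, which is what the four incremental learning algorithms of transformer-BLS exploit.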