Authors
Fei Chu, Tao Liang, C. L. Philip Chen, Xuesong Wang, Xiaoping Ma
Source
Journal: IEEE Transactions on Cybernetics
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Issue: 54 (1): 435-448
Citations: 7
Identifier
DOI: 10.1109/tcyb.2023.3267947
Abstract
Aiming to simplify the network structure of the broad learning system (BLS), this article proposes a novel simplification method called compact BLS (CBLS). Groups of nodes play an important role in the modeling process of BLS, which suggests that the nodes may be correlated. The proposed CBLS not only focuses on the compactness of the network structure but also pays close attention to the correlation between nodes. Drawing on the ideas of Fused Lasso and Smooth Lasso, it uses an L1-regularization term and a fusion term to penalize each output weight and the difference between adjacent output weights, respectively. The L1-regularization term determines the correlation between the nodes and the outputs, whereas the fusion term captures the correlation between nodes. By optimizing the output weights iteratively, both correlations are taken into account simultaneously during simplification. As a result, the network structure is simplified more reasonably without reducing prediction accuracy, and a sparse, smooth output-weight solution is obtained that reflects the group-learning characteristic of BLS. Furthermore, two different simplification strategies, based on the fusion terms of Fused Lasso and Smooth Lasso, respectively, are developed and compared. Multiple experiments on public datasets demonstrate the feasibility and effectiveness of the proposed methods.
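The penalty structure described in the abstract can be illustrated with a minimal sketch. The function below is a hypothetical illustration (the name `cbls_penalty` and the parameters `lam1`/`lam2` are not from the paper): it combines an L1 term on each output weight with a fusion term on differences between adjacent output weights, using an absolute-difference fusion term in the Fused-Lasso-style variant and a squared-difference fusion term in the Smooth-Lasso-style variant.

```python
import numpy as np

def cbls_penalty(w, lam1, lam2, variant="fused"):
    """Sketch of the two penalty variants on an output-weight vector w.

    lam1 weights the L1 (sparsity) term, which relates nodes to outputs;
    lam2 weights the fusion term on adjacent-weight differences, which
    encourages correlated neighboring nodes to share similar weights.
    """
    w = np.asarray(w, dtype=float)
    sparsity = lam1 * np.sum(np.abs(w))       # L1 term: per-weight penalty
    diffs = np.diff(w)                         # differences of adjacent output weights
    if variant == "fused":
        fusion = lam2 * np.sum(np.abs(diffs))  # Fused-Lasso-style: |w[i+1] - w[i]|
    elif variant == "smooth":
        fusion = lam2 * np.sum(diffs ** 2)     # Smooth-Lasso-style: (w[i+1] - w[i])^2
    else:
        raise ValueError("variant must be 'fused' or 'smooth'")
    return sparsity + fusion
```

In the paper this penalty is minimized iteratively together with the fitting loss; the sketch only shows how the two terms differ, since the absolute-difference variant tends to make adjacent weights exactly equal, whereas the squared-difference variant merely smooths them.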