Keywords: Threshold, Initialization, Computer science, Artificial neural network, Duality (order theory), Convexity, Dual (grammatical number), Convergence (economics), Artificial intelligence, Convolutional neural network, Algorithm, Mathematical optimization, Mathematics, Discrete mathematics, Image (mathematics), Art, Literature, Economics, Programming language, Financial economics, Economic growth
Authors
Chunyan Xiong, Chaoxing Zhang, Mengli Lu, Xin Yu, Jian Cao, Zhong Chen, Di Guo, Xiaobo Qu
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Pages: 1-13
Identifier
DOI: 10.1109/tnnls.2024.3353795
Abstract
Soft-thresholding has been widely used in neural networks. Its basic network structure is a two-layer convolutional neural network with soft-thresholding. Because the network is nonlinear and nonconvex, the training process depends heavily on an appropriate initialization of the network parameters, making it difficult to obtain a globally optimal solution. To address this issue, a convex dual network is designed here. We theoretically analyze the network's convexity and prove that strong duality holds. Extensive results on both simulated and real-world datasets show that strong duality holds, that the dual network depends on neither initialization nor optimizer, and that it converges faster than the state-of-the-art two-layer network. This work provides a new way to convexify soft-thresholding neural networks. Furthermore, the convex dual network model of a deep soft-thresholding network with a parallel structure is deduced.
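The soft-thresholding operator the abstract builds on is the standard shrinkage nonlinearity from sparse coding, S_λ(x) = sign(x) · max(|x| − λ, 0). A minimal NumPy sketch (the function name and test values are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding (shrinkage) operator:
    shifts each entry toward zero by lam and zeroes out
    anything whose magnitude is below lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(soft_threshold(x, 1.0))  # [-1.  0.  0.  0.  1.]
```

Applied element-wise after a convolution, this nonlinearity is what makes the two-layer network in the paper nonconvex in its parameters, motivating the convex dual formulation.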