Channel (broadcasting)
Computer science
Architecture
Differentiable function
Convolution (computer science)
Convolutional neural network
Key (lock)
Artificial intelligence
Computer engineering
Computer network
Artificial neural network
Mathematics
Computer security
Art
Mathematical analysis
Visual arts
Authors
Yu Xue,Changchang Lu,Ferrante Neri,Jiafeng Qin
Source
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
[Institute of Electrical and Electronics Engineers]
Date: 2023-08-17
Volume/Issue: 8 (1): 32-43
Citations: 13
Identifier
DOI:10.1109/tetci.2023.3301395
Abstract
Neural architecture search has attracted great attention in the research community and has recently been applied successfully in industry. Differentiable architecture search (DARTS) is an efficient architecture search method. However, the networks searched by DARTS are often unstable due to the large gap in architecture depth between the search phase and the verification phase. In addition, due to unfair exclusive competition between different candidate operations, DARTS is prone to skip-connection aggregation, which may cause performance collapse. In this article, we propose progressive partial channel connections based on channel attention for differentiable architecture search (PA-DARTS) to solve the above problems. In the early stage of searching, we select only a few key channels for convolution using channel attention and reserve all candidate operations. As the search progresses, we gradually increase the number of channels and eliminate unpromising candidate operations, ensuring that the search phase and verification phase are both carried out on 20 cells. Due to the partial channel connections based on channel attention, we can eliminate the unfair competition between operations and increase the stability of PA-DARTS. Experimental results showed that PA-DARTS could achieve 97.59% and 83.61% classification accuracy on CIFAR-10 and CIFAR-100, respectively. On ImageNet, our algorithm achieved 75.3% classification accuracy.
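The abstract's two key mechanisms — scoring channels with attention so that only the most important ones are fed to the candidate operations, and progressively growing the fraction of selected channels over the search — can be sketched in plain Python. This is a hypothetical illustration, not the authors' implementation: `channel_attention_scores` stands in for a learned squeeze-and-excitation module (here reduced to global average pooling), and the linear schedule in `progressive_k` is an assumption about how the channel count might grow.

```python
import random

def channel_attention_scores(feature_map):
    # feature_map: a list of channels, each a flat list of activations.
    # Squeeze step: global average pooling yields one score per channel.
    # (A real module would follow with a learned excitation network;
    # the pooled mean stands in for the learned attention weight here.)
    return [sum(ch) / len(ch) for ch in feature_map]

def select_partial_channels(feature_map, k):
    # Partial channel connection: only the k channels with the highest
    # attention scores go through the candidate operations; the rest
    # bypass them unchanged. Returns the selected channel indices.
    scores = channel_attention_scores(feature_map)
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:k])

def progressive_k(stage, num_channels, num_stages):
    # Hypothetical linear schedule: grow the selected-channel count from
    # roughly 1/num_stages of the channels up to all of them.
    return max(1, round(num_channels * (stage + 1) / num_stages))

# Toy example: 8 channels of 4 activations each, 3 search stages.
random.seed(0)
fmap = [[random.random() for _ in range(4)] for _ in range(8)]
for stage in range(3):
    k = progressive_k(stage, len(fmap), 3)
    print(f"stage {stage}: {k} channels -> {select_partial_channels(fmap, k)}")
```

Because selection is driven by attention scores rather than random sampling, every candidate operation sees the same informative channels, which is the property the abstract credits for removing unfair competition between operations.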