Differentiable function
Computer science
Artificial neural network
Architecture
Convolutional neural network
Random search
Deep learning
Artificial intelligence
Mathematical optimization
Algorithm
Mathematics
Art
Mathematical analysis
Visual arts
Authors
Lunchen Xie,Kaiyu Huang,Fan Xu,Qingjiang Shi
Identifier
DOI:10.1109/icassp49357.2023.10096612
Abstract
Neural Architecture Search (NAS) alleviates the time and human effort required to design deep neural networks. It remains challenging, however, to find good architectures at low search cost. In this paper, we propose a novel NAS framework that addresses the differentiable neural architecture search problem by revisiting the bi-level problem formulation from scratch. By combining the Zeroth-Order (ZO) gradient descent technique with implicit gradients, the proposed algorithm not only reduces the search time for suitable architectures compared with existing works, but also maintains the final accuracy. Experimental results show the efficacy of our proposed ZO-based NAS approach.
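The abstract's core idea is replacing exact architecture gradients with zeroth-order estimates. As a minimal sketch (not the paper's actual algorithm), the standard two-point ZO gradient estimator perturbs the parameters along random directions and averages the resulting finite-difference directional derivatives; the function `zo_gradient` and its parameter names below are illustrative assumptions, not identifiers from the paper:

```python
import math
import random

def zo_gradient(f, x, mu=1e-4, num_samples=5000):
    """Two-point zeroth-order estimate of the gradient of f at x.

    For each random unit direction u, the directional derivative is
    approximated by (f(x + mu*u) - f(x - mu*u)) / (2*mu); projecting
    back onto u and scaling by the dimension n (since E[u u^T] = I/n
    for uniform unit directions) gives an unbiased gradient estimate.
    """
    n = len(x)
    grad = [0.0] * n
    for _ in range(num_samples):
        # Draw a random direction uniformly on the unit sphere.
        u = [random.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(c * c for c in u))
        u = [c / norm for c in u]
        x_plus = [xi + mu * ui for xi, ui in zip(x, u)]
        x_minus = [xi - mu * ui for xi, ui in zip(x, u)]
        d = (f(x_plus) - f(x_minus)) / (2.0 * mu)
        for i in range(n):
            grad[i] += d * u[i] * n / num_samples
    return grad
```

In a NAS setting, `f` would be the (non-differentiated) validation loss as a function of the architecture parameters, so each gradient estimate costs only forward evaluations rather than backpropagation through the architecture weights.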