Authors
Mengting Wu, Chun-Wei Tsai
Source
Journal: ICT Express (Elsevier)
Date: 2023-11-10
Volume/Issue: 10 (1): 213-231
Citations: 5
Identifier
DOI: 10.1016/j.icte.2023.11.001
Abstract
The goal of neural architecture search (NAS) is to downsize the architecture and model of a deep neural network (DNN), to adjust an architecture to improve its end results, or to speed up the whole training process. Such improvements make it possible to generate or install a DNN model on a small device, such as an Internet of Things (IoT) device or a wireless sensor network node. Because most NAS algorithms are time-consuming, finding a way to reduce their computation costs has become a critical research issue. The training-free method (also called zero-shot NAS) provides a more efficient way to estimate how good a neural architecture is during the NAS process: a lightweight score function replaces the usual full training process, thereby avoiding its heavy costs. This paper starts with a brief discussion of DNNs and NAS, followed by a brief review of both model-dependent and model-independent training-free score functions. A brief introduction to the search algorithms and benchmarks widely used in training-free NAS is also given. The changes, potential, open issues, and future trends of this research topic are then addressed at the end of the paper.
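To make the idea of a lightweight score function concrete, the following is a minimal sketch of one well-known family of training-free proxies: a SNIP-style saliency score, which sums |weight x gradient| over a randomly initialized network for a single mini-batch, with no training at all. The MLP, its layer widths, and the function name `snip_score` are illustrative assumptions, not the paper's own method; the paper surveys many such score functions.

```python
import numpy as np

def snip_score(widths, x, y, rng):
    """SNIP-style zero-cost proxy: sum of |weight * gradient| at
    random initialization, for a ReLU MLP with the given layer widths.
    Hypothetical sketch; real training-free NAS uses many such proxies."""
    # Randomly initialize one weight matrix per layer (no biases for brevity).
    Ws = [rng.standard_normal((widths[i], widths[i + 1])) / np.sqrt(widths[i])
          for i in range(len(widths) - 1)]
    # Forward pass: ReLU on hidden layers, linear output layer.
    acts = [x]
    for i, W in enumerate(Ws):
        z = acts[-1] @ W
        acts.append(np.maximum(z, 0.0) if i < len(Ws) - 1 else z)
    # Backward pass for an MSE loss on one random batch.
    grad = 2.0 * (acts[-1] - y) / len(x)
    score = 0.0
    for i in reversed(range(len(Ws))):
        gW = acts[i].T @ grad               # gradient w.r.t. layer i weights
        score += np.abs(Ws[i] * gW).sum()   # saliency contribution of layer i
        if i > 0:
            grad = (grad @ Ws[i].T) * (acts[i] > 0)  # ReLU derivative mask
    return score

# Usage: rank two candidate widths by proxy score without training either one.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4))
y = rng.standard_normal((8, 1))
for widths in [(4, 16, 1), (4, 64, 1)]:
    print(widths, snip_score(widths, x, y, rng))
```

In a training-free NAS loop, a search algorithm would call such a score function on each candidate architecture and keep the highest-scoring ones, replacing hours of training per candidate with a single forward/backward pass.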