Computer science
Pruning
Artificial neural network
Artificial intelligence
Architecture
Machine learning
Scalability
Database
Art
Agronomy
Visual arts
Biology
Authors
Grigor Bezirganyan,Hayk Akarmazyan
Source
Journal: Kachaṛ
Date: 2022-07-26
Volume/Issue: (1): 203-219
Identifier
DOI: 10.54503/2579-2903-2022.1-203
Abstract
Neural networks have contributed to many breakthroughs across several disciplines. Their ease of use and scalability have motivated the development of many techniques in computer vision, natural language processing, audio analysis, etc. A neural network's architecture plays a dominant role in its performance, and there have been many advances in designs and strategies for building efficient neural networks. However, manually tuning neural architectures requires a significant amount of time and expert knowledge. To overcome the difficulty of manually setting up the architecture for a neural network, Neural Architecture Search (NAS) has gained popularity. NAS methods involve three general dimensions, namely the search space, the search strategy, and the performance estimation strategy [1]. Different approaches vary along these dimensions.
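To make the three dimensions concrete, the sketch below shows a toy NAS loop in Python. It is a minimal illustration, not the authors' method: the search space, the random-search strategy, and especially the placeholder scoring function (standing in for actually training and validating each candidate) are all assumptions introduced for this example.

```python
import random

# 1. Search space (assumed): each architecture is a choice per hyperparameter.
SEARCH_SPACE = {
    "num_layers": [2, 3, 4],
    "layer_width": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng: random.Random) -> dict:
    """Search strategy step: draw one candidate uniformly at random."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def estimate_performance(arch: dict, rng: random.Random) -> float:
    """Performance estimation strategy (placeholder).

    A real system would train the candidate (or a cheaper proxy) and
    return its validation accuracy; here a synthetic score keeps the
    sketch self-contained and runnable.
    """
    base = 0.5 + 0.05 * arch["num_layers"] + 0.0005 * arch["layer_width"]
    return min(base + rng.uniform(-0.05, 0.05), 1.0)

def random_search(num_trials: int = 20, seed: int = 0) -> tuple:
    """NAS loop: sample a candidate, estimate its quality, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = estimate_performance(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print(f"Best architecture: {arch} (estimated score {score:.3f})")
```

Swapping the random sampler for evolutionary search or a learned controller, or replacing the placeholder scorer with early-stopped training or weight sharing, corresponds to moving along the search-strategy and performance-estimation dimensions, respectively.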