Computer science
Pruning
Filter (signal processing)
Path (computing)
Pascal (unit)
Artificial neural network
Artificial intelligence
Object detection
Machine learning
Data mining
Computer engineering
Pattern recognition (psychology)
Computer vision
Agronomy
Biology
Programming language
Authors
Yuiko Sakuma, Masato Ishii, Takuya Narihira
Identifier
DOI: 10.1109/iccvw60793.2023.00143
Abstract
We address the challenge of training a large supernet for the object detection task using a relatively small amount of training data. Specifically, we propose an efficient supernet-based neural architecture search (NAS) method that uses search space pruning: the search space defined by the supernet is pruned by removing candidate models that are predicted to perform poorly. To remove candidates effectively over a wide range of resource constraints, we design a performance predictor for the supernet, called a path filter, which is conditioned on resource constraints and can accurately predict the relative performance of models that satisfy similar constraints. Supernet training is thereby focused on the best-performing candidates, and the path filter handles prediction for paths with different resource budgets. Compared to once-for-all, our proposed method reduces the computational cost of finding the optimal network architecture by 30% and 63%, while yielding a better accuracy vs. floating-point-operations (FLOPs) Pareto front (0.85 and 0.45 points of improvement in average precision on Pascal VOC and COCO, respectively).
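The pruning idea in the abstract — score candidate paths with a predictor conditioned on a resource budget, then drop the low scorers so supernet training concentrates on promising candidates — can be sketched in a few lines. This is a minimal toy illustration, not the paper's method: `flops`, `path_filter`, and the scoring rule are all hypothetical stand-ins (the real path filter is a learned predictor).

```python
import random

def flops(path):
    # Toy resource model: a path is a tuple of layer widths;
    # the FLOPs proxy is the sum of squared widths (hypothetical).
    return sum(w * w for w in path)

def path_filter(path, budget):
    # Hypothetical stand-in for the learned path filter: scores a path,
    # conditioned on the resource budget it must satisfy.
    # Here: prefer wider layers, heavily penalize budget violations.
    score = sum(path)
    if flops(path) > budget:
        score -= 1000  # infeasible under this budget
    return score

def prune_search_space(paths, budget, keep_ratio=0.5):
    # Rank candidates with the resource-conditioned predictor and keep
    # only the top fraction, focusing supernet training on them.
    ranked = sorted(paths, key=lambda p: path_filter(p, budget), reverse=True)
    keep = max(1, int(len(ranked) * keep_ratio))
    return ranked[:keep]

random.seed(0)
search_space = [tuple(random.choice([16, 32, 64]) for _ in range(4))
                for _ in range(20)]
pruned = prune_search_space(search_space, budget=4 * 64 * 64, keep_ratio=0.25)
print(len(pruned))  # 5 candidates survive
```

In the paper's setting the predictor is trained rather than hand-written, and pruning is applied across many budgets so the surviving candidates cover the whole resource range; the structure of the loop — score, rank, keep the top fraction — is the same.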