Random search for hyper-parameter optimization

Keywords: hyper-parameter optimization, random search, computer science, grid, set (abstract data type), artificial neural network, fraction (chemistry), search algorithm, data mining, artificial intelligence, machine learning, algorithm, mathematics, chemistry, geometry, organic chemistry, support vector machine, programming language
Authors
James Bergstra, Yoshua Bengio
Source
Journal: Journal of Machine Learning Research
Volume/issue: 13 (1): 281-305 · Cited by: 1268
Abstract

Grid search and manual search are the most widely used strategies for hyper-parameter optimization. This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid. Empirical evidence comes from a comparison with a large previous study that used grid search and manual search to configure neural networks and deep belief networks. Compared with neural networks configured by a pure grid search, we find that random search over the same domain is able to find models that are as good or better within a small fraction of the computation time. Granting random search the same computational budget, random search finds better models by effectively searching a larger, less promising configuration space. Compared with deep belief networks configured by a thoughtful combination of manual search and grid search, purely random search over the same 32-dimensional configuration space found statistically equal performance on four of seven data sets, and superior performance on one of seven. A Gaussian process analysis of the function from hyper-parameters to validation set performance reveals that for most data sets only a few of the hyper-parameters really matter, but that different hyper-parameters are important on different data sets. This phenomenon makes grid search a poor choice for configuring algorithms for new data sets. Our analysis casts some light on why recent High Throughput methods achieve surprising success--they appear to search through a large number of hyper-parameters because most hyper-parameters do not matter much. We anticipate that growing interest in large hierarchical models will place an increasing burden on techniques for hyper-parameter optimization; this work shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
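
The paper's central mechanism is easy to see in code. The sketch below is a minimal illustration, not code from the paper; the toy objective, the names lr and momentum, and the 16-trial budget are all assumptions. It compares grid search and random search under the same budget on a function where only one hyper-parameter matters: a 4x4 grid spends 16 trials but tests only 4 distinct values of the important axis, while 16 random trials test 16.

```python
import random

# A toy validation loss in which only one of two hyper-parameters matters.
# The objective, the names "lr"/"momentum", and the 16-trial budget are
# illustrative assumptions, not values from the paper.
def validation_loss(lr, momentum):
    # Loss is dominated by lr; momentum contributes almost nothing,
    # mimicking the paper's "low effective dimensionality" observation.
    return (lr - 0.3) ** 2 + 1e-4 * (momentum - 0.5) ** 2

BUDGET = 16  # total trials granted to each strategy

# Grid search: a 4x4 grid uses the full budget but tests only 4 distinct
# values of the important hyper-parameter lr.
grid_axis = [i / 3 for i in range(4)]  # 0.0, 1/3, 2/3, 1.0
grid_best = min(validation_loss(lr, m) for lr in grid_axis for m in grid_axis)

# Random search: the same 16 trials test 16 distinct values of lr
# (and of momentum), so the important axis is covered more densely.
random.seed(0)
random_best = min(
    validation_loss(random.random(), random.random()) for _ in range(BUDGET)
)

print(f"grid search best loss:   {grid_best:.6f}")
print(f"random search best loss: {random_best:.6f}")
```

Because the projection of the random trials onto any single axis still gives 16 distinct points, random search covers the important dimension far more densely than a grid of the same total size. This is exactly the reason the abstract gives for random search matching or beating grid and manual search: when most hyper-parameters barely matter, grid search wastes almost all of its budget re-testing the same few values of the ones that do.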