Random Search for Hyper-Parameter Optimization

Authors
James Bergstra, Yoshua Bengio
Source
Journal of Machine Learning Research, 13(1): 281-305. Cited by: 1268
Abstract

Grid search and manual search are the most widely used strategies for hyper-parameter optimization. This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid. Empirical evidence comes from a comparison with a large previous study that used grid search and manual search to configure neural networks and deep belief networks. Compared with neural networks configured by a pure grid search, we find that random search over the same domain is able to find models that are as good or better within a small fraction of the computation time. Granting random search the same computational budget, random search finds better models by effectively searching a larger, less promising configuration space. Compared with deep belief networks configured by a thoughtful combination of manual search and grid search, purely random search over the same 32-dimensional configuration space found statistically equal performance on four of seven data sets, and superior performance on one of seven. A Gaussian process analysis of the function from hyper-parameters to validation set performance reveals that for most data sets only a few of the hyper-parameters really matter, but that different hyper-parameters are important on different data sets. This phenomenon makes grid search a poor choice for configuring algorithms for new data sets. Our analysis casts some light on why recent High Throughput methods achieve surprising success--they appear to search through a large number of hyper-parameters because most hyper-parameters do not matter much. We anticipate that growing interest in large hierarchical models will place an increasing burden on techniques for hyper-parameter optimization; this work shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
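The abstract's core argument lends itself to a worked illustration. A standard back-of-envelope consequence of independent sampling (the calculation is not in the abstract itself): if the good configurations occupy 5% of the search volume, the probability that n random trials all miss that region is 0.95^n, so about 60 trials give a better than 95% chance of hitting it, regardless of the number of hyper-parameters. Below is a minimal Python sketch of the phenomenon the paper describes; the toy objective function and the parameter names (lr, momentum) are hypothetical stand-ins, not taken from the paper.

```python
import math
import random

# Toy stand-in for "validation error as a function of hyper-parameters".
# Per the paper's observation, only one of the two parameters (the
# learning rate) matters much; momentum is nearly irrelevant.
# This function is hypothetical, not taken from the paper.
def validation_loss(lr, momentum):
    return (math.log10(lr) + 2.5) ** 2 + 0.01 * (momentum - 0.5) ** 2

def grid_search(n_per_axis):
    """n_per_axis**2 trials on a regular grid: the important
    parameter is probed at only n_per_axis distinct values."""
    lrs = [10.0 ** (-4 + 4 * i / (n_per_axis - 1)) for i in range(n_per_axis)]
    momenta = [i / (n_per_axis - 1) for i in range(n_per_axis)]
    return min(validation_loss(lr, m) for lr in lrs for m in momenta)

def random_search(n_trials, rng):
    """The same trial budget spent on independent draws probes
    n_trials distinct values of *every* parameter."""
    best = float("inf")
    for _ in range(n_trials):
        lr = 10.0 ** rng.uniform(-4.0, 0.0)   # log-uniform over [1e-4, 1]
        momentum = rng.uniform(0.0, 1.0)
        best = min(best, validation_loss(lr, momentum))
    return best

rng = random.Random(0)
print("grid search, 5 x 5 = 25 trials:", grid_search(5))
print("random search,    25 trials   :", random_search(25, rng))
# The grid's best learning rate sits half a grid step from the optimum
# in the one direction that matters, so random search almost always
# finds a lower loss here with the same budget of 25 trials.
```

Under this toy objective, a 5 x 5 grid spends 25 trials but tests only 5 distinct learning rates; 25 random trials test 25 distinct learning rates and almost surely land closer to the optimum. This mirrors the paper's point that grid search wastes its budget replicating values along the unimportant dimensions.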