Designing neural networks through neuroevolution

Keywords: neuroevolution, artificial intelligence, computer science, artificial neural networks
Authors
Kenneth O. Stanley, Jeff Clune, Joel Lehman, Risto Miikkulainen
Source
Journal: Nature Machine Intelligence [Nature Portfolio]
Volume/issue: 1 (1): 24-35; cited by: 613
Identifier
DOI: 10.1038/s42256-018-0006-z
Abstract

Much of recent machine learning has focused on deep learning, in which neural network weights are trained through variants of stochastic gradient descent. An alternative approach comes from the field of neuroevolution, which harnesses evolutionary algorithms to optimize neural networks, inspired by the fact that natural brains themselves are the products of an evolutionary process. Neuroevolution enables important capabilities that are typically unavailable to gradient-based approaches, including learning neural network building blocks (for example activation functions), hyperparameters, architectures and even the algorithms for learning themselves. Neuroevolution also differs from deep learning (and deep reinforcement learning) by maintaining a population of solutions during search, enabling extreme exploration and massive parallelization. Finally, because neuroevolution research has (until recently) developed largely in isolation from gradient-based neural network research, it has developed many unique and effective techniques that should be effective in other machine learning areas too. This Review looks at several key aspects of modern neuroevolution, including large-scale computing, the benefits of novelty and diversity, the power of indirect encoding, and the field's contributions to meta-learning and architecture search. Our hope is to inspire renewed interest in the field as it meets the potential of the increasing computation available today, to highlight how many of its ideas can provide an exciting resource for inspiration and hybridization to the deep learning, deep reinforcement learning and machine learning communities, and to explain how neuroevolution could prove to be a critical tool in the long-term pursuit of artificial general intelligence.

Deep neural networks have become very successful at certain machine learning tasks partly due to the widely adopted training method of backpropagation. An alternative way to optimize neural networks is to use evolutionary algorithms, which, fuelled by the increase in computing power, offer a new range of capabilities and modes of learning.
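To make the core idea concrete, the following is a minimal sketch of neuroevolution: a population of flat weight genomes for a fixed-topology network is evaluated on a task, the fittest genomes are kept, and offspring are produced by Gaussian mutation. This is a toy illustration of the population-based search the abstract describes, not any specific published algorithm (such as NEAT); the problem (XOR), network size, and all hyperparameters are invented for this example.

```python
import numpy as np

# Toy task: evolve the weights of a fixed 2-2-1 network to compute XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def forward(genome, x):
    # Decode a flat 9-gene genome into layer weights and biases.
    W1 = genome[:4].reshape(2, 2)   # input -> hidden weights
    b1 = genome[4:6]                # hidden biases
    W2 = genome[6:8]                # hidden -> output weights
    b2 = genome[8]                  # output bias
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(genome):
    # Negative mean squared error: higher is better.
    return -np.mean((forward(genome, X) - y) ** 2)

rng = np.random.default_rng(0)
pop = rng.normal(0.0, 1.0, size=(50, 9))  # 50 random genomes

for gen in range(300):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-10:]]            # keep the 10 best
    parents = elite[rng.integers(0, 10, size=50)]    # sample parents
    pop = parents + rng.normal(0.0, 0.1, size=parents.shape)  # mutate
    pop[:10] = elite                                 # elitism: best survive intact

best = max(pop, key=fitness)
print((forward(best, X) > 0.5).astype(int))
```

Gradient information is never used: selection plus mutation alone drives the search, which is why the same loop applies unchanged to non-differentiable building blocks such as architectures or activation functions. Extensions mentioned in the Review (indirect encodings, novelty search, massive parallel evaluation of the population) all slot into this basic evaluate-select-mutate cycle.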