Topics
Hyperparameter
Computer science
Machine learning
Robustness (evolution)
Bayesian probability
Artificial intelligence
Artificial neural network
Scalability
Uncertainty quantification
Regression
Mathematics
Statistics
Biochemistry
Database
Gene
Chemistry
Authors
Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
Source
Journal: Cornell University - arXiv
Date: 2016-12-05
Cited by: 2964
Identifiers
DOI: 10.48550/arxiv.1612.01474
Abstract
Deep neural networks (NNs) are powerful black box predictors that have recently achieved impressive performance on a wide spectrum of tasks. Quantifying predictive uncertainty in NNs is a challenging and yet unsolved problem. Bayesian NNs, which learn a distribution over weights, are currently the state-of-the-art for estimating predictive uncertainty; however these require significant modifications to the training procedure and are computationally expensive compared to standard (non-Bayesian) NNs. We propose an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high quality predictive uncertainty estimates. Through a series of experiments on classification and regression benchmarks, we demonstrate that our method produces well-calibrated uncertainty estimates which are as good or better than approximate Bayesian NNs. To assess robustness to dataset shift, we evaluate the predictive uncertainty on test examples from known and unknown distributions, and show that our method is able to express higher uncertainty on out-of-distribution examples. We demonstrate the scalability of our method by evaluating predictive uncertainty estimates on ImageNet.
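The method the abstract describes (later known as "deep ensembles") is simple enough to sketch directly. Below is a minimal, illustrative sketch in PyTorch for a small regression setup: each of M independently initialized networks predicts a Gaussian mean and variance and is trained with the Gaussian negative log-likelihood (a proper scoring rule), and the ensemble is combined as a uniform mixture of Gaussians. The network size, the toy data, M = 5, and all training settings here are assumptions for illustration, not the paper's experimental configuration; the paper's optional adversarial-training step is omitted.

    # Minimal sketch of the deep-ensembles idea, under the assumptions stated above.
    import torch
    import torch.nn as nn

    class GaussianMLP(nn.Module):
        """Small regression net predicting a mean and a variance per input."""
        def __init__(self, hidden=64):
            super().__init__()
            self.body = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(),
                                      nn.Linear(hidden, 2))

        def forward(self, x):
            mu, raw_var = self.body(x).chunk(2, dim=-1)
            # Softplus keeps the predicted variance positive.
            return mu, nn.functional.softplus(raw_var) + 1e-6

    def nll(mu, var, y):
        # Gaussian negative log-likelihood: a proper scoring rule.
        return (0.5 * torch.log(var) + 0.5 * (y - mu) ** 2 / var).mean()

    # Toy 1-D regression data (illustrative).
    x = torch.linspace(-3, 3, 200).unsqueeze(-1)
    y = torch.sin(x) + 0.1 * torch.randn_like(x)

    # Train M independent nets; random initialization supplies the diversity.
    ensemble = [GaussianMLP() for _ in range(5)]
    for net in ensemble:
        opt = torch.optim.Adam(net.parameters(), lr=1e-2)
        for _ in range(500):
            opt.zero_grad()
            mu, var = net(x)
            nll(mu, var, y).backward()
            opt.step()

    # Combine predictions as a uniform mixture of Gaussians.
    with torch.no_grad():
        mus, vars_ = zip(*(net(x) for net in ensemble))
        mus, vars_ = torch.stack(mus), torch.stack(vars_)
        mean = mus.mean(0)
        # Mixture variance: E[var_i + mu_i^2] - mean^2; it grows where the
        # members disagree, which is what flags out-of-distribution inputs.
        variance = (vars_ + mus ** 2).mean(0) - mean ** 2

Because each network trains independently, the M training runs parallelize trivially, which is the "readily parallelizable" property the abstract highlights.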