Keywords
Computer science
Artificial intelligence
Computation
Field (mathematics)
Machine learning
Artificial neural network
Deep learning
Electricity
Data science
Renewable energy
Engineering
Algorithm
Mathematics
Electrical engineering
Pure mathematics
Authors
Emma Strubell, Ananya Ganesh, Andrew McCallum
Source
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2020-04-03
Volume/Issue: 34 (09): 13693-13696
Citations: 241
Identifier
DOI: 10.1609/aaai.v34i09.7123
Abstract
The field of artificial intelligence has experienced a dramatic methodological shift towards large neural networks trained on plentiful data. This shift has been fueled by recent advances in hardware and techniques enabling remarkable levels of computation, resulting in impressive advances in AI across many applications. However, the massive computation required to obtain these exciting results is costly both financially, due to the price of specialized hardware and electricity or cloud compute time, and to the environment, as a result of non-renewable energy used to fuel modern tensor processing hardware. In a paper published this year at ACL, we brought this issue to the attention of NLP researchers by quantifying the approximate financial and environmental costs of training and tuning neural network models for NLP (Strubell, Ganesh, and McCallum 2019). In this extended abstract, we briefly summarize our findings in NLP, incorporating updated estimates and broader information from recent related publications, and provide actionable recommendations to reduce costs and improve equity in the machine learning and artificial intelligence community.
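The cost quantification referenced above comes from the cited ACL paper (Strubell, Ganesh, and McCallum 2019), which estimates training energy as the combined average power draw of CPU, DRAM, and GPUs over the training time, scaled by a data-center power usage effectiveness (PUE) factor, and then converts energy to CO2-equivalent emissions using an average grid emission factor. Below is a minimal sketch of that accounting; the constants (a PUE of 1.58 and 0.954 lbs CO2e per kWh) are the values reported in the 2019 paper, while the hardware power figures and training duration in the usage example are hypothetical placeholders.

```python
# Sketch of the energy/CO2e accounting described in Strubell, Ganesh,
# and McCallum (2019). Constants are the values reported in that paper;
# treat them as assumptions when applying this to other settings.

PUE = 1.58           # data-center power usage effectiveness (2018 industry average)
CO2E_PER_KWH = 0.954 # lbs CO2-equivalent per kWh (average US grid, per the paper)

def training_energy_kwh(hours: float, cpu_watts: float, dram_watts: float,
                        num_gpus: int, gpu_watts: float) -> float:
    """Total training energy in kWh: average draw of CPU, DRAM, and GPUs
    over the training time, scaled by PUE for cooling/infrastructure overhead."""
    total_watts = cpu_watts + dram_watts + num_gpus * gpu_watts
    return PUE * hours * total_watts / 1000.0

def co2e_lbs(kwh: float) -> float:
    """Approximate CO2-equivalent emissions (lbs) for the given energy."""
    return CO2E_PER_KWH * kwh

# Hypothetical example: 8 GPUs drawing ~300 W each, training for 120 hours.
kwh = training_energy_kwh(hours=120, cpu_watts=100, dram_watts=30,
                          num_gpus=8, gpu_watts=300)
print(f"{kwh:.0f} kWh, ~{co2e_lbs(kwh):.0f} lbs CO2e")
```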