Keywords
Artificial intelligence
Computer science
Machine learning
Carbon footprint
Variety (cybernetics)
Artificial neural network
Electricity
Deep learning
Deep neural network
Footprint
Greenhouse gas
Engineering
Ecology
Paleontology
Electrical engineering
Biology
Authors
Emma Strubell, Ananya Ganesh, Andrew McCallum
Source
Venue: Meeting of the Association for Computational Linguistics
Date: 2019-01-01
Citations: 1931
Abstract
Recent progress in hardware and methodology for training neural networks has ushered in a new generation of large networks trained on abundant data. These models have obtained notable gains in accuracy across many NLP tasks. However, these accuracy improvements depend on the availability of exceptionally large computational resources that necessitate similarly substantial energy consumption. As a result, these models are costly to train and develop, both financially, due to the cost of hardware and electricity or cloud compute time, and environmentally, due to the carbon footprint required to fuel modern tensor processing hardware. In this paper we bring this issue to the attention of NLP researchers by quantifying the approximate financial and environmental costs of training a variety of recently successful neural network models for NLP. Based on these findings, we propose actionable recommendations to reduce costs and improve equity in NLP research and practice.
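
As a rough illustration of the kind of accounting the paper performs, the Python sketch below estimates training energy and emissions from average hardware power draw. The PUE of 1.58 and the U.S. grid average of 0.954 lbs CO2e per kWh are constants the paper reports using; the hardware wattages in the example run, and all function names, are illustrative assumptions rather than figures from the paper.

def training_energy_kwh(hours, cpu_watts, dram_watts, gpu_watts, num_gpus, pue=1.58):
    """Estimated total training energy in kWh, scaled by datacenter PUE."""
    total_watts = cpu_watts + dram_watts + num_gpus * gpu_watts
    return pue * hours * total_watts / 1000.0

def co2e_lbs(kwh, lbs_per_kwh=0.954):
    """Estimated CO2-equivalent emissions in pounds (U.S. grid average)."""
    return kwh * lbs_per_kwh

if __name__ == "__main__":
    # Hypothetical job: 8 GPUs at ~300 W each, training for one week.
    kwh = training_energy_kwh(hours=24 * 7, cpu_watts=100, dram_watts=25,
                              gpu_watts=300, num_gpus=8)
    print(f"~{kwh:.0f} kWh, ~{co2e_lbs(kwh):.0f} lbs CO2e")

For this hypothetical job the estimate comes to roughly 670 kWh and about 640 lbs of CO2e; swapping in measured wattages and a local grid carbon intensity yields a comparable back-of-the-envelope figure for any training run.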