Computer science
Artificial intelligence
Machine learning
Natural language processing
Language understanding
Field (mathematics)
Mathematics
Pure mathematics
Authors
Kaili Sun,Xudong Luo,Michael Y. Luo
Identifier
DOI:10.1007/978-3-031-10986-7_36
Abstract
With the emergence of Pretrained Language Models (PLMs) and the success of large-scale PLMs such as BERT and GPT, the field of Natural Language Processing (NLP) has developed tremendously, and PLMs have become an indispensable technique for solving NLP problems. In this paper, we survey PLMs to help researchers quickly understand the various models and determine which are appropriate for their specific NLP projects. Specifically, we first briefly describe the main machine learning methods used by PLMs. Second, we explore early PLMs and discuss the main state-of-the-art PLMs. Third, we review several Chinese PLMs. Fourth, we compare the performance of some mainstream PLMs. Fifth, we outline the applications of PLMs. Finally, we give an outlook on the future development of PLMs.
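The abstract presents PLMs such as BERT as ready-to-use building blocks for NLP tasks. As an illustration only (not taken from the paper), the minimal sketch below assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint, and runs BERT's masked-language-model pretraining objective on a single sentence.

```python
# Illustrative sketch: querying a pretrained language model (PLM) with its
# masked-language-model head. Assumes the Hugging Face `transformers` library
# and the public "bert-base-uncased" checkpoint; not code from the surveyed paper.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# A sentence with one [MASK] token, the slot BERT was pretrained to fill in.
text = "Pretrained language models are widely used in natural [MASK] processing."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: [batch, seq_len, vocab_size]

# Find the [MASK] position and take the highest-scoring vocabulary item.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # expected to print something like "language"
```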