Computer science
Classifier (UML)
Artificial intelligence
Encoder
Test data
Transformer
Transfer learning
Training set
Machine learning
Deep learning
Task (project management)
Natural language processing
Engineering
Systems engineering
Voltage
Electrical engineering
Programming language
Operating system
Source
Journal: Lecture Notes in Networks and Systems
Date: 2021-01-01
Pages: 517-523
Identifier
DOI:10.1007/978-3-030-80285-1_59
Abstract
User-intent classification is a sub-task of natural language understanding in human-computer dialogue systems. To reduce the data volume that deep learning requires for intent classification, this paper proposes a transfer learning method for the Chinese user-intent classification task, based on the Bidirectional Encoder Representations from Transformers (BERT) pre-trained language model. First, a simulation experiment with 31 Chinese participants was conducted to collect first-hand Chinese human-computer conversation data. Then, the data was augmented through back-translation and randomly split into training, validation and test datasets. Next, the BERT model was fine-tuned into a Chinese user-intent classifier. As a result, the predicting accuracy of the BERT classifier reaches 99.95%, 98.39% and 99.89% on the training, validation and test datasets, respectively. The result suggests that applying BERT transfer learning reduces the data volume required for the Chinese intent classification task to an acceptable level.
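The data-preparation step the abstract describes (back-translation augmentation followed by a random train/validation/test split) can be sketched in plain Python. This is a minimal illustration, not the authors' code: `back_translate` stands in for a real machine-translation round trip (e.g. zh → en → zh), and the 80/10/10 split ratio is an assumption, since the paper does not state the actual proportions.

```python
import random

def back_translate(utterance: str) -> str:
    """Placeholder for back-translation augmentation (zh -> en -> zh).
    A real pipeline would call a machine-translation system here;
    this stub returns the input so the pipeline shape stays runnable."""
    return utterance

def augment_and_split(samples, seed=42, train_frac=0.8, val_frac=0.1):
    """Double the dataset via back-translation, shuffle, and split
    into training / validation / test subsets.

    `samples` is a list of (utterance, intent_label) pairs.
    The 80/10/10 ratio is illustrative, not from the paper."""
    augmented = samples + [(back_translate(u), label) for u, label in samples]
    rng = random.Random(seed)          # fixed seed for a reproducible split
    rng.shuffle(augmented)
    n = len(augmented)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = augmented[:n_train]
    val = augmented[n_train:n_train + n_val]
    test = augmented[n_train + n_val:]
    return train, val, test
```

The fine-tuning stage itself would then train a pre-trained Chinese BERT classifier on the `train` split, selecting checkpoints on `val` and reporting final accuracy on `test`, matching the three accuracy figures the abstract reports.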