Journal: Lecture Notes in Networks and Systems
Date: 2021-01-01
Pages: 517-523
Identifier
DOI:10.1007/978-3-030-80285-1_59
Abstract
User-intent classification is a sub-task of natural language understanding in human-computer dialogue systems. To reduce the data volume that deep learning requires for intent classification, this paper proposes a transfer learning method for the Chinese user-intent classification task, based on the Bidirectional Encoder Representations from Transformers (BERT) pre-trained language model. First, a simulation experiment with 31 Chinese participants was conducted to collect first-hand Chinese human-computer conversation data. The data was then augmented through back-translation and randomly split into training, validation, and test datasets. Next, the BERT model was fine-tuned into a Chinese user-intent classifier. The resulting classifier reaches a prediction accuracy of 99.95%, 98.39%, and 99.89% on the training, validation, and test datasets, respectively. These results suggest that BERT-based transfer learning reduces the data volume required for the Chinese intent classification task to a satisfactory level.
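The random split of the augmented data into training, validation, and test datasets can be sketched as follows. The abstract does not state the split ratio, so the 80/10/10 proportions, the fixed seed, and the placeholder utterances here are illustrative assumptions, not the paper's actual setup.

```python
import random

def split_dataset(samples, train_frac=0.8, val_frac=0.1, seed=42):
    """Randomly partition samples into train/validation/test subsets.

    The 80/10/10 ratio and the seed are assumptions for illustration;
    the paper only states that the augmented data was split randomly.
    """
    rng = random.Random(seed)       # seeded RNG for a reproducible shuffle
    shuffled = samples[:]           # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]   # remainder goes to the test set
    return train, val, test

# Placeholder conversation data standing in for the collected utterances.
utterances = [f"utterance_{i}" for i in range(100)]
train, val, test = split_dataset(utterances)
print(len(train), len(val), len(test))  # 80 10 10
```

Fine-tuning then trains a classifier on `train`, selects hyperparameters on `val`, and reports final accuracy on `test`, matching the three figures quoted in the abstract.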