Keywords
Symbol
Mandarin Chinese
Computer science
Artificial intelligence
Coding (set theory)
Natural language processing
Speech recognition
Programming language
Mathematics
Arithmetic
Linguistics
Philosophy
Set (abstract data type)
Authors
Cao Hong Nga, Duc-Quang Vu, Huong Hoang Luong, Chien-Lin Huang, Jia-Ching Wang
Source
Journal: IEEE Signal Processing Letters
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume: 30, Pages: 1387-1391
Citations: 2
Identifier
DOI: 10.1109/lsp.2023.3307350
Abstract
Transfer learning is a common method to improve the performance of a model on a target task by pre-training it on pretext tasks. Unlike methods that use only monolingual corpora for pre-training, in this study we propose a Cyclic Transfer Learning method (CTL) that utilizes both code-switching (CS) and monolingual speech resources as the pretext tasks. Moreover, the model in our approach is always trained by alternating among these tasks, which helps it improve its performance by maintaining CS features while transferring knowledge. Experimental results on the standard SEAME Mandarin-English CS corpus show that our proposed CTL approach achieves the best performance, with a Mixed Error Rate (MER) of 16.3% on test$_{man}$ and 24.1% on test$_{sge}$. Compared to the baseline model pre-trained with monolingual data, our CTL method achieves 11.4% and 8.7% relative MER reductions on the test$_{man}$ and test$_{sge}$ sets, respectively. The CTL approach also outperforms other state-of-the-art methods. The source code of the CTL method can be found at https://github.com/caohongnga/CTL-CSSR.
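The core idea of CTL, as described above, is that a single model cycles through code-switching and monolingual pretext tasks instead of finishing one pre-training corpus before moving on. The sketch below illustrates that alternating loop in Python/PyTorch under assumed placeholders: the toy datasets, the simple classifier, and the task names are illustrative only and are not the authors' implementation (see the linked GitHub repository for the real code).

# Minimal sketch of cyclic transfer learning: the same model is trained
# alternately on monolingual and code-switching (CS) pretext tasks, cycling
# through the task list so CS features are retained while knowledge transfers.
# All datasets, model sizes, and hyperparameters here are assumed placeholders.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def toy_loader(num_samples=32, feat_dim=80, num_classes=10):
    # Stand-in for a real speech corpus (e.g. Mandarin, English, or SEAME CS data).
    x = torch.randn(num_samples, feat_dim)
    y = torch.randint(0, num_classes, (num_samples,))
    return DataLoader(TensorDataset(x, y), batch_size=8, shuffle=True)

tasks = {
    "mono_mandarin": toy_loader(),
    "mono_english": toy_loader(),
    "code_switching": toy_loader(),
}

# One shared model whose parameters are carried across all pretext tasks.
model = nn.Sequential(nn.Linear(80, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

num_cycles = 3  # number of passes over the full task list
for cycle in range(num_cycles):
    for name, loader in tasks.items():  # alternate tasks within each cycle
        for x, y in loader:
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
        print(f"cycle {cycle}: finished pass over task '{name}'")

# After this cyclic pre-training, the model would be fine-tuned and evaluated
# on the target code-switching ASR task (e.g. the SEAME test sets).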