Keywords
Computer science, Meta-learning, Task, Artificial intelligence, Consistency, Multi-task learning, Machine learning, Resource, Computer network, Management, Economics
Authors
Yaqi Chen, Hao Zhang, Xukui Yang, Wenlin Zhang, Dan Qu
Identifier
DOI:10.1007/978-3-031-44693-1_28
Abstract
We propose a new meta-learning-based framework that enhances previous approaches to low-resource speech recognition. Meta-learning has proven to be a powerful paradigm for transferring knowledge from prior tasks to facilitate the learning of a novel task. However, when faced with complex task environments and diverse task learning directions, averaging all task gradients is ineffective at capturing meta-knowledge. To address this challenge, we propose a task-consistent multilingual meta-learning (TCMML) method that adopts the gradient agreement algorithm to steer the model parameters in a direction on which tasks are more consistent. If a task's gradient agrees with the average gradient, its weight in meta-optimization is increased, and vice versa. Experiments on two datasets demonstrate that the proposed system achieves performance comparable or even superior to state-of-the-art baselines on low-resource languages, and that it can easily be combined with various meta-learning methods.
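The abstract describes weighting each task's meta-gradient by its agreement with the average gradient before the outer-loop update. The sketch below is an illustrative reconstruction of that idea under my own assumptions (NumPy, synthetic per-task gradients, a hypothetical `gradient_agreement_update` helper, and agreement measured as the inner product with the mean gradient); it is not the authors' implementation.

```python
# Minimal sketch of gradient-agreement-weighted meta-optimization.
# Assumption: per-task meta-gradients are already computed (e.g. after
# inner-loop adaptation); here they are just synthetic vectors.
import numpy as np

def gradient_agreement_update(theta, task_grads, meta_lr=0.01):
    """Weight each task gradient by its agreement with the mean gradient.

    theta      : current meta-parameters, shape (d,)
    task_grads : list of per-task meta-gradients, each shape (d,)
    meta_lr    : outer-loop (meta) learning rate
    """
    g = np.stack(task_grads)              # (num_tasks, d)
    g_avg = g.mean(axis=0)                # plain averaged direction
    scores = g @ g_avg                    # agreement with the average gradient
    # Normalize scores into weights; tasks whose gradients oppose the
    # average direction receive negative (down-weighted) contributions.
    weights = scores / (np.abs(scores).sum() + 1e-12)
    meta_grad = (weights[:, None] * g).sum(axis=0)   # weighted combination
    return theta - meta_lr * meta_grad

# Toy usage with three synthetic "task" gradients, one of them conflicting.
theta = np.zeros(4)
grads = [np.array([1.0, 0.5, 0.0, 0.2]),
         np.array([0.9, 0.4, 0.1, 0.3]),
         np.array([-0.8, -0.5, 0.0, -0.1])]
theta = gradient_agreement_update(theta, grads)
print(theta)
```

Compared with averaging all task gradients, this weighting reduces the influence of tasks whose learning directions conflict with the consensus, which is the consistency effect the abstract attributes to TCMML.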