Computer science
Distillation
Relation (database)
Relation extraction
Artificial intelligence
Annotation
Constraint (computer-aided design)
Construct (python library)
Simplicity (philosophy)
Task (project management)
Noise (video)
Machine learning
Pattern recognition (psychology)
Data mining
Image (mathematics)
Chromatography
Mathematics
Geometry
Philosophy
Economics
Chemistry
Management
Programming language
Epistemology
Authors
Rui Li, Cheng Yang, Tingwei Li, Sen Su
Abstract
Relation extraction (RE), an important information extraction task, faces a great challenge from limited annotated data. To address this, distant supervision was proposed to label RE data automatically, greatly increasing the number of annotated instances. Unfortunately, the many noisy relation annotations introduced by automatic labeling become a new obstacle. Recent studies have shown that the teacher-student framework of knowledge distillation can alleviate the interference of noisy relation annotations via label softening. Nevertheless, we find that these methods still suffer from two problems: propagation of inaccurate dark knowledge and the constraint of a unified distillation temperature. In this article, we propose a simple and effective Multi-instance Dynamic Temperature Distillation (MiDTD) framework, which is model-agnostic and mainly involves two modules: multi-instance target fusion (MiTF) and dynamic temperature regulation (DTR). MiTF combines the teacher's predictions for multiple sentences with the same entity pair to amend the inaccurate dark knowledge in each student's target. DTR allocates alterable distillation temperatures to different training instances so that the softness of most students' targets is regulated to a moderate range. In experiments, we construct three concrete MiDTD instantiations with BERT-, PCNN-, and BiLSTM-based RE models, and the distilled students significantly outperform both their teachers and state-of-the-art (SOTA) methods.
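The two ideas the abstract names can be illustrated with a minimal sketch: temperature-scaled softmax for label softening, bag-level fusion of the teacher's predictions over sentences that share an entity pair (standing in for MiTF), and a per-instance temperature passed to the distillation loss (standing in for DTR). The averaging fusion and the per-instance temperature argument here are illustrative assumptions, not the paper's exact MiTF/DTR formulations.

```python
import math

def softened(logits, T):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    z = [l / T for l in logits]
    m = max(z)  # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def fused_target(bag_logits, T):
    """Multi-instance fusion (assumed: average the teacher's softened
    predictions over all sentences sharing the same entity pair)."""
    probs = [softened(l, T) for l in bag_logits]
    n = len(probs)
    return [sum(p[i] for p in probs) / n for i in range(len(probs[0]))]

def kd_loss(student_logits, target, T):
    """Distillation loss: cross-entropy between the fused soft target and
    the student's prediction, with an instance-specific temperature T."""
    p = softened(student_logits, T)
    return -sum(t * math.log(q + 1e-12) for t, q in zip(target, p))
```

In a full training loop, each bag of sentence-level teacher logits would be fused once per entity pair, and a dynamically regulated T (rather than a single global value) would be supplied per instance.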