Computer science
Robustness (evolution)
Machine learning
Artificial intelligence
Noise (video)
Relation (database)
Grasp
Coding (set theory)
Data mining
Biochemistry
Gene
Image (mathematics)
Set (abstract data type)
Chemistry
Programming language
Authors
Tianyu Gao, Xu Han, Zhiyuan Liu, Maosong Sun
Venue
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2019-07-17
Volume/Issue: 33 (01): 6407-6414
Citations: 214
Identifier
DOI:10.1609/aaai.v33i01.33016407
Abstract
The existing methods for relation classification (RC) primarily rely on distant supervision (DS) because large-scale supervised training datasets are not readily available. Although DS automatically annotates adequate amounts of data for model training, the coverage of this data is still quite limited, and meanwhile many long-tail relations still suffer from data sparsity. Intuitively, people can grasp new knowledge from only a few instances. We thus provide a different view on RC by formalizing RC as a few-shot learning (FSL) problem. However, current FSL models mainly focus on low-noise vision tasks, which makes it hard for them to directly handle the diversity and noise of text. In this paper, we propose hybrid attention-based prototypical networks for the problem of noisy few-shot RC. We design instance-level and feature-level attention schemes based on prototypical networks to highlight the crucial instances and features respectively, which significantly enhances the performance and robustness of RC models in a noisy FSL scenario. Besides, our attention schemes accelerate the convergence of RC models. Experimental results demonstrate that our hybrid attention-based models require fewer training iterations and outperform the state-of-the-art baseline models. The code and datasets are released at https://github.com/thunlp/HATT-Proto.
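To make the abstract's idea concrete, the following is a minimal numpy sketch of the two attention schemes it describes, not the authors' implementation: instance-level attention weights each support instance by its similarity to the query before computing the class prototype, and feature-level attention reweights dimensions in the distance metric (here approximated with a simple inverse-variance heuristic; the paper learns these weights).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hybrid_prototype_logits(support, query):
    """Score a query against N classes in an N-way K-shot episode.

    support: (N, K, D) embeddings of K support instances per class
    query:   (D,) embedding of the query instance
    returns: (N,) logits (negative weighted distances to prototypes)
    """
    # Instance-level attention: support instances more similar to the
    # query get larger weights, so noisy instances contribute less to
    # the prototype than a plain mean would allow.
    sim = support @ query                              # (N, K)
    alpha = softmax(sim, axis=1)                       # (N, K)
    protos = (alpha[..., None] * support).sum(axis=1)  # (N, D)

    # Feature-level attention (illustrative stand-in for the learned
    # weights): down-weight dimensions with high variance across the
    # support set, as they discriminate the classes less reliably.
    var = support.reshape(-1, support.shape[-1]).var(axis=0)  # (D,)
    z = 1.0 / (1.0 + var)                                     # (D,)

    # Weighted squared Euclidean distance; closer prototype = larger logit.
    dist = ((protos - query) ** 2 * z).sum(axis=1)     # (N,)
    return -dist
```

In a 2-way 2-shot episode where class 0's support embeddings cluster near the query, `np.argmax(hybrid_prototype_logits(support, query))` returns 0; the attention weights simply sharpen that decision when some support instances are noisy.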