Computer science
Ontology
Knowledge extraction
Open Knowledge Base Connectivity
Bridge (graph theory)
Knowledge graph
Domain knowledge
Graphics
Knowledge base
Information retrieval
Artificial intelligence
Machine learning
Knowledge management
Theoretical computer science
Personal knowledge management
Medicine
Organizational learning
Philosophy
Epistemology
Internal medicine
Authors
Hongbin Ye, Ningyu Zhang, Shumin Deng, Xiang Chen, Hui Chen, Feiyu Xiong, Xi Chen, Huajun Chen
Identifier
DOI:10.1145/3485447.3511921
Abstract
Few-shot Learning (FSL) aims to make predictions based on a limited number of samples. Structured data such as knowledge graphs and ontology libraries has been leveraged to benefit the few-shot setting in various tasks. However, the priors adopted by existing methods suffer from missing knowledge, knowledge noise, and knowledge heterogeneity, which hinder few-shot performance. In this study, we explore knowledge injection for FSL with pre-trained language models and propose ontology-enhanced prompt-tuning (OntoPrompt). Specifically, we develop an ontology transformation based on the external knowledge graph to address the missing-knowledge issue, completing the ontology and converting structured knowledge into text. We further introduce span-sensitive knowledge injection via a visible matrix to select informative knowledge and handle the knowledge-noise issue. To bridge the gap between knowledge and text, we propose a collective training algorithm to jointly optimize the representations. We evaluate the proposed OntoPrompt on three tasks, relation extraction, event extraction, and knowledge graph completion, across eight datasets. Experimental results demonstrate that our approach achieves better few-shot performance than the baselines.
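The abstract names a "visible matrix" for span-sensitive knowledge injection but does not spell out the masking scheme. Below is a minimal, hypothetical sketch in Python (NumPy), assuming a K-BERT-style visibility mask in which verbalized ontology text attends only to the entity span it annotates; `verbalize_triple`, `build_visible_matrix`, and the toy token layout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def verbalize_triple(head, relation, tail):
    """Hypothetical template turning a KG triple into auxiliary prompt text."""
    return f"{head} {relation.replace('_', ' ')} {tail}."

def build_visible_matrix(n_tokens, sentence_idx, knowledge_idx, span_idx):
    """Return an (n_tokens, n_tokens) 0/1 visibility mask.

    sentence_idx: positions of the original sentence tokens.
    knowledge_idx: positions of the injected knowledge tokens.
    span_idx: positions of the entity span the knowledge annotates.
    """
    m = np.zeros((n_tokens, n_tokens), dtype=np.int64)
    # Sentence tokens attend to one another as usual.
    for i in sentence_idx:
        for j in sentence_idx:
            m[i, j] = 1
    # Injected knowledge tokens attend to one another.
    for i in knowledge_idx:
        for j in knowledge_idx:
            m[i, j] = 1
    # Knowledge tokens and the annotated span see each other, so the
    # injected text influences only the span it describes (noise control).
    for i in knowledge_idx:
        for j in span_idx:
            m[i, j] = 1
            m[j, i] = 1
    return m

if __name__ == "__main__":
    print(verbalize_triple("Paris", "capital_of", "France"))
    # Toy layout: tokens 0-4 are the sentence, 5-7 are injected knowledge,
    # and tokens 2-3 form the entity span the knowledge annotates.
    mask = build_visible_matrix(8, range(5), range(5, 8), [2, 3])
    print(mask)
```

In a transformer, such a mask would typically be applied by adding `(1 - mask) * -1e4` (or similar) to the attention logits before the softmax, so invisible token pairs receive effectively zero attention weight.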