Relation extraction
Computer science
Artificial intelligence
Relation (database)
Information extraction
Natural language processing
Knowledge graph
Generalization
Task (project management)
Knowledge extraction
Language model
Data mining
Mathematics
Management
Economics
Mathematical analysis
Authors
Wei Zhao,Qinghui Chen,Junling You
Identifier
DOI:10.1145/3650400.3650478
Abstract
Entity relation extraction aims to extract knowledge triples from unstructured or semi-structured text and can be applied in many fields, including medical and financial knowledge graph construction and intelligent question answering. Traditional entity relation extraction requires large amounts of labeled data, consuming considerable labor and time, and the trained models lack generalization ability, making them difficult to transfer to other domains. Zero-shot entity relation extraction relieves the dependence of traditional methods on labeled data. Because it operates on unlabeled text, it offers strong domain adaptability, making it a challenging and practical task. Recent work on large language models shows that they can effectively complete downstream tasks through natural language instructions and generalize well. Inspired by this, we explore the use of large language models for information extraction. Because large language model generation is stochastic, we introduce in-context learning into the entity relation extraction task to guide the model to output data in a specified format, helping to obtain structured data. We also propose a three-stage extraction framework that decomposes the entity relation extraction task, with each stage conducted in question-and-answer form to reduce the complexity of extraction. We evaluated the knowledge triple extraction performance of the model on three self-built test datasets from different fields; the experimental results show that our method achieves impressive performance on the zero-shot entity relation extraction task, surpassing the comparison models on multiple metrics and demonstrating its effectiveness and domain adaptability.
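The staged, question-and-answer prompting described in the abstract can be sketched as a small pipeline. This is an illustrative outline, not the authors' implementation: `call_llm` is a hypothetical stand-in for any large-language-model completion API, stubbed here with canned answers so the control flow runs end to end, and the in-context example, prompts, and output formats are all assumptions.

```python
# Sketch of a staged, Q&A-style entity relation extraction pipeline
# guided by in-context learning (format-pinning examples in the prompt).

def call_llm(prompt: str) -> str:
    # Hypothetical LLM call; replace with a real API client.
    # Stubbed responses let the pipeline run without a model.
    if "List the entities" in prompt:
        return "Aspirin; headache"
    if "relation holds" in prompt:
        return "treats"
    return ""

# An in-context example pins the semicolon-separated output format,
# reducing the randomness of free-form generation.
IN_CONTEXT_EXAMPLE = (
    'Text: "Metformin is used for type 2 diabetes."\n'
    "Entities: Metformin; type 2 diabetes\n"
)

def extract_triples(text: str):
    # Stage 1: ask for the entities in the text.
    ents_raw = call_llm(
        f'{IN_CONTEXT_EXAMPLE}Text: "{text}"\nList the entities:'
    )
    entities = [e.strip() for e in ents_raw.split(";") if e.strip()]

    # Stages 2-3: for each entity pair, ask which relation holds,
    # then assemble the structured (head, relation, tail) triple.
    triples = []
    for i, head in enumerate(entities):
        for tail in entities[i + 1:]:
            relation = call_llm(
                f'In "{text}", what relation holds between '
                f'"{head}" and "{tail}"?'
            ).strip()
            if relation and relation.lower() != "none":
                triples.append((head, relation, tail))
    return triples

print(extract_triples("Aspirin relieves headache."))
# → [('Aspirin', 'treats', 'headache')]
```

Decomposing the task into separate entity and relation questions keeps each prompt simple, which is the motivation the abstract gives for the three-stage design.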