Computer science
Event (particle physics)
Graph
Artificial intelligence
Context (archaeology)
Machine learning
Data mining
Natural language processing
Theoretical computer science
Quantum mechanics
Biology
Physics
Paleontology
Authors
Meng Hu, Lu Bai, Mei Yang
Identifier
DOI: 10.1109/iaeac54830.2022.9929851
Abstract
Script event prediction refers to selecting the most likely subsequent event from multiple candidates given a sequence of contextual events. Existing methods typically build prediction models on event pairs or event chains, but these approaches cannot fully capture the complex relationships between context events and candidate events. Two difficulties remain: how to fully mine the meaning of events in text data so as to represent them accurately, and how to exploit the latent information between event nodes in a narrative event graph to improve prediction accuracy. To address them, this paper proposes Bert-SatGNN, a script event prediction model that combines a BERT pre-trained model, a structural self-attention mechanism, and a narrative event graph. The model introduces BERT pre-training into script event prediction for the first time, yielding more accurate representations of the event nodes fed into the prediction model. A multi-head structural self-attention module then learns the structural information of the event nodes in the narrative event graph, capturing the latent information of causally linked evolving events, and this structural information is finally combined with temporal (sequence) information to predict the final event. Experiments on the widely used New York Times dataset show that our model outperforms state-of-the-art methods.
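To make the described pipeline concrete, the following is a minimal sketch in PyTorch with Hugging Face Transformers: BERT [CLS] vectors serve as event-node embeddings, multi-head self-attention over the nodes stands in for the structural module, a GRU over the ordered context supplies temporal information, and a linear layer scores candidates. This is an illustrative approximation only, not the authors' Bert-SatGNN implementation; the class name `EventScorer`, the pooling, the GRU, and the scoring head are assumptions, and the paper's actual narrative event graph construction and adjacency handling are omitted.

```python
# Minimal, hypothetical sketch of a BERT + multi-head structural self-attention
# scorer for script event prediction. Not the authors' Bert-SatGNN code.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class EventScorer(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", heads=4):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # Structural information: multi-head self-attention over event nodes.
        self.struct_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Temporal information: recurrence over the ordered context events.
        self.temporal = nn.GRU(hidden, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def encode_events(self, events, tokenizer, device):
        # Encode each event phrase with BERT; use the [CLS] vector as the node embedding.
        enc = tokenizer(events, padding=True, truncation=True, return_tensors="pt").to(device)
        return self.bert(**enc).last_hidden_state[:, 0]           # (num_events, hidden)

    def forward(self, context_emb, candidate_emb):
        # context_emb: (num_context, hidden); candidate_emb: (num_candidates, hidden)
        nodes = torch.cat([context_emb, candidate_emb], dim=0).unsqueeze(0)
        # Attention among all event nodes (context + candidates).
        struct, _ = self.struct_attn(nodes, nodes, nodes)
        # Sequence summary over the context chain only.
        _, h = self.temporal(context_emb.unsqueeze(0))
        summary = struct[0, :context_emb.size(0)].mean(0) + h[-1, 0]
        # Score each candidate against the combined structural + temporal context.
        cand = struct[0, context_emb.size(0):]
        return self.score(cand + summary).squeeze(-1)              # (num_candidates,)
```

In use, the context events and each candidate would be encoded with encode_events, passed through forward, and the candidate with the highest score taken as the predicted subsequent event.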