Causality (physics)
Computer science
Event (particle physics)
Association (psychology)
Artificial intelligence
Natural language processing
Channel (broadcasting)
Artificial neural network
Transition (genetics)
Machine learning
Psychology
Physics
Biochemistry
Chemistry
Quantum mechanics
Psychotherapist
Gene
Computer network
Authors
Jianqi Gao, Hang Yu, Shuang Zhang
Identifier
DOI: 10.1016/j.knosys.2022.109935
Abstract
Event Causality Extraction (ECE) plays an essential role in many Natural Language Processing (NLP) tasks, such as event prediction and dialogue generation. Recent NLP research treats ECE as a sequence labeling problem. However, these methods tend to extract events and their causality with a single collapsed model, which usually focuses on textual content while ignoring the intra-element transitions within events and the causality transition associations across events. In general, ECE should capture both the complex intra-event relationships and the causality transition associations among events. Therefore, we propose a novel dual-channel enhanced neural network that addresses this limitation by taking both global event mentions and causality transition associations into account. To extract complete event mentions, a Textual Enhancement Channel (TEC) is constructed to learn important intra-event features from the training data with a wider perception field. A Knowledge Enhancement Channel (KEC) then incorporates external causality transition knowledge through a Graph Convolutional Network (GCN) to provide complementary information on event causality. Finally, we design a dynamic fusion attention mechanism to measure the importance of the two channels. Thus, the proposed model incorporates both semantic-level and knowledge-level representations of events to extract the relevant event causality. Experimental results on three public datasets show that our model outperforms state-of-the-art methods.
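The dynamic fusion attention mechanism described in the abstract can be pictured as a learned, per-token gate that weighs the TEC and KEC representations before the sequence-labeling classifier. The sketch below is a minimal illustration of that idea in PyTorch; the `DynamicFusion` module, its gating form, and all dimensions are hypothetical assumptions made for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class DynamicFusion(nn.Module):
    """Per-token gated fusion of two channel representations (illustrative sketch only)."""

    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # The gate looks at both channels and produces a weight in (0, 1) per token.
        self.gate = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, tec_repr: torch.Tensor, kec_repr: torch.Tensor) -> torch.Tensor:
        # tec_repr, kec_repr: (batch, seq_len, hidden_dim)
        alpha = torch.sigmoid(self.gate(torch.cat([tec_repr, kec_repr], dim=-1)))
        fused = alpha * tec_repr + (1.0 - alpha) * kec_repr
        return self.classifier(fused)  # per-token label logits for sequence labeling


# Stand-in tensors for the textual (TEC) and knowledge (KEC) channel outputs.
batch, seq_len, hidden = 2, 16, 128
tec_out = torch.randn(batch, seq_len, hidden)
kec_out = torch.randn(batch, seq_len, hidden)
logits = DynamicFusion(hidden, num_labels=7)(tec_out, kec_out)
print(logits.shape)  # torch.Size([2, 16, 7])
```

A gate of this shape lets the model lean on the knowledge channel for tokens whose causality is better explained by external transition knowledge, and on the textual channel otherwise; the paper's actual weighting scheme may differ.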