Computer science
Parsing
Transformer
Natural language processing
Syntax
Artificial intelligence
Named entity recognition
Segmentation
Dual (grammatical number)
Natural language
Abstract syntax
Task (project management)
Linguistics
Philosophy
Economics
Voltage
Physics
Management
Quantum mechanics
Authors
Yinlong Xiao, Zongcheng Ji, Jianqiang Li
Identifier
DOI: 10.1109/icassp48485.2024.10446771
Abstract
Named Entity Recognition (NER) is a fundamental task in natural language processing. Syntax plays a significant role in helping to recognize the boundaries and types of entities. Compared with English, Chinese NER often struggles to determine entity boundaries because Chinese text has no explicit word delimiters. Moreover, syntactic parsing results can themselves contain errors caused by incorrect word segmentation. In this paper, we propose a dual-grained syntax-aware Transformer network that mitigates the noise in single-grained syntactic parsing results by incorporating dual-grained syntactic information. Specifically, we first introduce syntax-aware Transformers to model dual-grained syntax-aware features and a contextual Transformer to model contextual features. We then design a triple feature aggregation module to dynamically fuse these features. We validate the effectiveness of our approach on three public datasets.
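The triple feature aggregation module described in the abstract dynamically fuses three feature streams (contextual, character-grained syntax-aware, word-grained syntax-aware). The abstract does not specify the fusion mechanism, so the sketch below assumes a common design choice: per-token gates computed from the concatenated features, followed by a gated weighted sum. All names and shapes here are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def triple_feature_aggregation(h_ctx, h_char_syn, h_word_syn, W):
    """Fuse contextual, character-grained, and word-grained syntax-aware
    features with per-token gates (a sketch; gating scheme is an assumption)."""
    # Stack the three feature streams: (seq_len, 3, d)
    feats = np.stack([h_ctx, h_char_syn, h_word_syn], axis=1)
    # Per-token gate logits from concatenated features: (seq_len, 3)
    logits = np.concatenate([h_ctx, h_char_syn, h_word_syn], axis=-1) @ W
    gates = softmax(logits, axis=-1)
    # Gated weighted sum over the three streams: (seq_len, d)
    return (gates[..., None] * feats).sum(axis=1)

# Toy example with random features (seq_len=5 tokens, d=8 dims)
seq_len, d = 5, 8
rng = np.random.default_rng(0)
h_ctx, h_c, h_w = (rng.standard_normal((seq_len, d)) for _ in range(3))
W = rng.standard_normal((3 * d, 3))  # hypothetical learned gate projection
fused = triple_feature_aggregation(h_ctx, h_c, h_w, W)
print(fused.shape)  # (5, 8)
```

Because the gates are a softmax over the three streams, the module can down-weight the syntax-aware features for tokens where segmentation-induced parsing noise is likely, which matches the abstract's stated motivation for combining dual-grained syntactic information.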