Computer science
Artificial intelligence
Named entity recognition
Pattern recognition (psychology)
Natural language processing
Task (project management)
Economics
Management
Author
Hong-Jin Kim, Harksoo Kim
Identifier
DOI: 10.1016/j.eswa.2024.123657
Abstract
Nested named entity recognition (NER) has mainly been studied by recognizing inner entities first and then outer entities, or vice versa, depending on the nesting level. However, previous models for nested NER do not address the data sparseness problem of deeply nested entities (i.e., the scarcity of training data at deeper nesting levels) or the error propagation problem between nested entities (i.e., the accumulation of errors introduced by inner or outer entities). To alleviate the data sparseness problem, we propose a recursive label attention network. In contrast to previous models, the proposed model explicitly reflects the nesting level and effectively reuses lower-level label information through level-reflected label embeddings. To counteract error propagation in the recursive architecture, we propose an inner-entity pretraining strategy in which the proposed model is trained sequentially from lower to higher nesting levels. In our experiments, we use task-specific metrics for nested NER, unlike the evaluation metrics used in previous studies. The experimental results show that the proposed model is particularly effective when entities of the same type are nested. In addition, it achieves high F1-scores even when named entities are deeply nested. These results indicate that the proposed model is effective for nested NER.
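The abstract describes two mechanisms: level-reflected label embeddings consumed by a recursive label attention layer, and an inner-entity pretraining schedule that trains nesting levels from innermost to outermost. The paper's code is not shown here, so the PyTorch sketch below is only an illustration of those two ideas under assumptions: the names RecursiveLabelAttention, inner_entity_pretrain, encoder, and batches_by_level are hypothetical, as is the choice of multi-head attention as the label attention mechanism; none of this should be read as the authors' actual implementation.

import torch
import torch.nn as nn

class RecursiveLabelAttention(nn.Module):
    # Sketch: at nesting level k, token states attend over label embeddings
    # from level k-1 that are combined with a level embedding
    # ("level-reflected label embeddings"), then labels for level k are predicted.
    def __init__(self, num_labels, max_levels, hidden_dim):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, hidden_dim)
        self.level_emb = nn.Embedding(max_levels, hidden_dim)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, token_states, prev_labels, level):
        # token_states: (batch, seq_len, hidden_dim) from any sentence encoder
        # prev_labels:  (batch, seq_len) label ids predicted at the previous level
        lvl = torch.full_like(prev_labels, level)
        label_states = self.label_emb(prev_labels) + self.level_emb(lvl)
        # Tokens query the lower-level label information.
        attended, _ = self.attn(token_states, label_states, label_states)
        return self.classifier(token_states + attended)  # (batch, seq_len, num_labels)

def inner_entity_pretrain(model, encoder, batches_by_level, max_levels, epochs=1):
    # Sketch of the inner-entity pretraining strategy: train level by level,
    # from the innermost level outward, so higher levels start from parameters
    # already fit to the lower levels. batches_by_level is a hypothetical
    # mapping from level to (tokens, prev_labels, gold) batches; at level 0,
    # prev_labels can simply be all "O" label ids.
    loss_fn = nn.CrossEntropyLoss()
    opt = torch.optim.Adam(list(model.parameters()) + list(encoder.parameters()), lr=1e-4)
    for level in range(max_levels):  # lower (inner) levels first
        for _ in range(epochs):
            for tokens, prev_labels, gold in batches_by_level[level]:
                logits = model(encoder(tokens), prev_labels, level)
                loss = loss_fn(logits.view(-1, logits.size(-1)), gold.view(-1))
                opt.zero_grad()
                loss.backward()
                opt.step()

Training the inner levels first means the label and level embeddings reused at higher levels are already informative, which is the abstract's stated remedy for error propagation in the recursive architecture.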