Keywords: computer science, natural language processing, named entity recognition, conditional random fields, artificial intelligence, feature engineering, Transformer, encoder, artificial neural networks, feature (linguistics), dialogue, deep learning, speech recognition, linguistics
Authors
Changxia Ma, Chen Zhang
Identifier
DOI: 10.1142/s0218001421530037
Abstract
Current named entity recognition (NER) systems are mainly based on convolutional or recurrent neural networks. To achieve high performance, these networks require large amounts of training data together with feature-engineered corpora and lexicons. Chinese NER is particularly challenging because of the high contextual relevance of Chinese characters: a character or phrase may carry many possible meanings depending on its context. To this end, we propose a model for Chinese NER that combines a pre-trained Bidirectional Encoder Representations from Transformers (BERT) language model with a joint bi-directional long short-term memory (Bi-LSTM) and conditional random field (CRF) model. The underlying network layer embeds Chinese characters and outputs character-level representations. These representations are then fed into a Bi-LSTM to capture contextual sequence information. The top layer of the proposed model is a CRF, which accounts for the dependencies between adjacent tags and jointly decodes the optimal tag sequence. A series of extensive experiments was conducted to evaluate the improvements offered by the proposed architecture on different datasets, without relying heavily on handcrafted features or domain-specific knowledge. The results show that the proposed model is effective and that character-level representations are of great significance for Chinese NER tasks. In addition, through this work we have compiled a new informal conversational message corpus, the autonomous bus information inquiry dataset, on which our method significantly outperforms strong baselines.
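The joint decoding step described in the abstract — the CRF layer picking the optimal chain of tags given per-character scores — is typically done with Viterbi dynamic programming. The following is a minimal NumPy sketch of that decoding step only, not the authors' implementation; the emission scores stand in for Bi-LSTM outputs, and the tag set and transition scores are illustrative assumptions.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence as a list of tag indices.

    emissions:   (seq_len, num_tags) per-character tag scores
                 (in the paper's model these would come from the Bi-LSTM)
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
                 (the CRF parameters capturing adjacent-tag dependencies)
    """
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()              # best score ending in each tag at step 0
    backptr = np.zeros((seq_len, num_tags), dtype=int)
    for t in range(1, seq_len):
        # cand[i, j] = best score ending in tag i at t-1, then transitioning to tag j
        cand = score[:, None] + transitions
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + emissions[t]
    # backtrack from the best final tag
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

# Toy example with a BIO-style tag set: 0 = O, 1 = B, 2 = I.
# The transition matrix strongly penalises the illegal move O -> I.
emissions = np.array([[0.0, 2.0, 0.0],   # char 1: B most likely
                      [1.0, 0.0, 2.0],   # char 2: I most likely
                      [2.0, 0.0, 0.0]])  # char 3: O most likely
transitions = np.zeros((3, 3))
transitions[0, 2] = -10.0                # forbid O -> I

print(viterbi_decode(emissions, transitions))  # -> [1, 2, 0], i.e. B I O
```

Because decoding maximises the sum of emission and transition scores over the whole sequence, the CRF can reject locally plausible but globally inconsistent taggings (such as an I tag with no preceding B), which is the motivation for placing it on top of the Bi-LSTM.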