Joint Pre-Trained Chinese Named Entity Recognition Based on Bi-Directional Language Model

Computer science · Conditional random field · Named entity recognition · Artificial intelligence · Feature engineering · Joint · Natural language processing · Transformer · Encoder · Artificial neural network · Feature (linguistics) · Dialogue · Deep learning · Speech recognition · Linguistics · Construction engineering · Philosophy · Physics · Management · Quantum mechanics · Voltage · Engineering · Economics · Task (project management) · Operating system
Authors
Changxia Ma, Chen Zhang
Source
Journal: International Journal of Pattern Recognition and Artificial Intelligence [World Scientific]
Volume/Issue: 35 (09): 2153003-2153003 · Cited by: 4
Identifier
DOI: 10.1142/s0218001421530037
Abstract

Current named entity recognition (NER) systems are mainly based on joint convolutional or recurrent neural networks. To achieve high performance, these networks require large amounts of training data in the form of feature-engineered corpora and lexicons. Chinese NER is particularly challenging because of the high contextual relevance of Chinese characters: a character or phrase may carry many possible meanings in different contexts. To this end, we propose a model that combines a pre-trained Bidirectional Encoder Representations from Transformers (BERT) language model with a joint bi-directional long short-term memory (Bi-LSTM) and conditional random field (CRF) model for Chinese NER. The underlying network layer embeds Chinese characters and outputs character-level representations. The output is then fed into a Bi-LSTM to capture contextual sequence information. The top layer of the proposed model is a CRF, which takes into account the dependencies between adjacent tags and jointly decodes the optimal tag sequence. A series of extensive experiments was conducted to investigate the improvements offered by the proposed neural architecture on different datasets without relying heavily on handcrafted features or domain-specific knowledge. Experimental results show that the proposed model is effective and that character-level representation is of great significance for Chinese NER tasks. In addition, through this work we have compiled a new informal conversational message corpus, the autonomous bus information inquiry dataset, and compared to strong baselines our method achieves significant improvements.
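The abstract describes a three-stage stack: pre-trained BERT character embeddings, a Bi-LSTM sequence encoder, and a CRF decoding layer. The following minimal sketch illustrates that kind of architecture in PyTorch; it is not the authors' code, and the checkpoint name `bert-base-chinese`, the LSTM hidden size, and the use of the pytorch-crf package are assumptions made for illustration.

```python
# Minimal sketch (assumed stack, not the paper's implementation) of a
# BERT + Bi-LSTM + CRF tagger for Chinese NER, using PyTorch,
# HuggingFace Transformers, and the pytorch-crf package.
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pytorch-crf; assumed dependency


class BertBiLstmCrf(nn.Module):
    def __init__(self, num_tags: int,
                 bert_name: str = "bert-base-chinese",  # assumed checkpoint
                 lstm_hidden: int = 256):
        super().__init__()
        # Character-level contextual embeddings from a pre-trained Chinese BERT.
        self.bert = BertModel.from_pretrained(bert_name)
        # Bi-directional LSTM captures sequence context on top of BERT outputs.
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        # Linear layer produces per-character emission scores for the CRF.
        self.emissions = nn.Linear(2 * lstm_hidden, num_tags)
        # CRF models dependencies between adjacent tags and decodes jointly.
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        # tags, when given, are expected to be aligned with the tokenized
        # input (including [CLS]/[SEP] positions), one tag id per token.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        hidden, _ = self.lstm(hidden)
        scores = self.emissions(hidden)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: mean negative log-likelihood of the gold tag sequence.
            return -self.crf(scores, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the most likely tag chain.
        return self.crf.decode(scores, mask=mask)
```

In this sketch, training minimizes the returned negative log-likelihood, while calling the model without `tags` returns the Viterbi-optimal tag sequence for each sentence, which matches the joint decoding role the abstract assigns to the CRF layer.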
