Computer science
Artificial intelligence
Pooling
Convolutional neural network
Key (lock)
Task (project management)
Natural language processing
Word (group theory)
Recurrent neural network
Contrast (vision)
Machine learning
Tree (set theory)
Artificial neural network
Pattern recognition (psychology)
Mathematical analysis
Philosophy
Economics
Management
Linguistics
Computer security
Mathematics
Authors
Siwei Lai,Liheng Xu,Kang Liu,Jun Zhao
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2015-02-19
Volume/Issue: 29 (1)
Citations: 1954
Identifiers
DOI: 10.1609/aaai.v29i1.9513
Abstract
Text classification is a foundational task in many NLP applications. Traditional text classifiers often rely on many human-designed features, such as dictionaries, knowledge bases and special tree kernels. In contrast to traditional methods, we introduce a recurrent convolutional neural network for text classification without human-designed features. In our model, we apply a recurrent structure to capture contextual information as far as possible when learning word representations, which may introduce considerably less noise compared to traditional window-based neural networks. We also employ a max-pooling layer that automatically judges which words play key roles in text classification to capture the key components in texts. We conduct experiments on four commonly used datasets. The experimental results show that the proposed method outperforms the state-of-the-art methods on several datasets, particularly on document-level datasets.
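To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of an RCNN-style classifier: a bidirectional recurrent layer supplies left and right context for each word, the context is concatenated with the word embedding, and max-pooling over time selects the positions that contribute most to the decision. The use of an LSTM for the recurrent structure, the layer sizes, and all names (e.g. RCNNSketch) are illustrative assumptions based only on the abstract, not the authors' exact model.

```python
# Hypothetical RCNN-style text classifier sketched from the abstract; not the paper's code.
import torch
import torch.nn as nn

class RCNNSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional recurrent layer: captures left and right context for each word
        # (assumption: an LSTM stands in for the paper's recurrent structure).
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Combine [contextual states; word embedding] into a latent word representation.
        self.proj = nn.Linear(embed_dim + 2 * hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):              # token_ids: (batch, seq_len)
        emb = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        ctx, _ = self.rnn(emb)                 # (batch, seq_len, 2 * hidden_dim)
        combined = torch.cat([ctx, emb], dim=-1)
        latent = torch.tanh(self.proj(combined))
        # Max-pooling over time keeps, per feature, the strongest position,
        # i.e. the words that "play key roles" in the classification decision.
        pooled, _ = latent.max(dim=1)          # (batch, hidden_dim)
        return self.classifier(pooled)         # unnormalized class scores

if __name__ == "__main__":
    model = RCNNSketch(vocab_size=10_000, num_classes=4)
    logits = model(torch.randint(0, 10_000, (8, 50)))  # batch of 8 sequences, length 50
    print(logits.shape)                                 # torch.Size([8, 4])
```

The max over the time dimension is what lets the model behave like the abstract describes: unlike average pooling, it lets a few highly indicative words dominate the document representation.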