Computer science
Sentiment analysis
Deep learning
Artificial intelligence
Transformer
Convolutional neural network
Benchmark
Natural language processing
Transfer learning
Language model
Taxonomy
Machine learning
Data science
Authors
Tariq Abdullah, Ahmed Ahmet
Abstract
Humans are increasingly integrated with devices that enable the collection of vast amounts of unstructured, opinionated data. Accurately analysing subjective information from this data is the task of sentiment analysis (an actively researched area in NLP). Deep learning provides a diverse selection of architectures for modelling sentiment analysis tasks and has surpassed other machine learning methods as the foremost approach for performing sentiment analysis. Recent developments in deep learning architectures represent a shift away from recurrent and convolutional neural networks towards the increasing adoption of Transformer language models. Utilising pre-trained Transformer language models to transfer knowledge to downstream tasks has been a breakthrough in NLP. This survey applies a task-oriented taxonomy to recent trends in architectures, with a focus on theory, design and implementation. To the best of our knowledge, this is the only survey to cover state-of-the-art Transformer-based language models and their performance on the most widely used benchmark datasets. The survey also discusses the open challenges in NLP and sentiment analysis, and covers the five years from 1st July 2017 to 1st July 2022.
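To make the transfer-learning idea mentioned in the abstract concrete, below is a minimal sketch (not taken from the survey) of applying a pre-trained Transformer language model to a downstream sentiment analysis task. It assumes the Hugging Face `transformers` library and an illustrative choice of the public `distilbert-base-uncased-finetuned-sst-2-english` checkpoint; the surveyed papers evaluate many other models and benchmarks.

```python
# Illustrative only: downstream sentiment classification with a pre-trained
# Transformer, using the Hugging Face `transformers` pipeline API.
from transformers import pipeline

# Load a Transformer checkpoint already fine-tuned for binary sentiment
# classification (positive/negative); the model name here is an example choice.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Classify a few opinionated sentences.
examples = [
    "The new update is fantastic and runs much faster.",
    "Customer support was slow and unhelpful.",
]
for text, result in zip(examples, classifier(examples)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {text}")
```

In practice, transfer learning for sentiment analysis typically means either using such an off-the-shelf fine-tuned checkpoint directly, or further fine-tuning a pre-trained language model on a task-specific labelled corpus.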