Computer science
Transfer learning
Artificial intelligence
Task (project management)
Machine learning
Inference
Disadvantage
Natural language processing
Knowledge transfer
Deep learning
Scarcity
Transfer of training
Knowledge management
Economics
Microeconomics
Management
Authors
Moritz Laurer, Wouter van Atteveldt, Andreu Casas, Kasper Welbers
Source
Journal: Political Analysis
[Cambridge University Press]
Date: 2023-06-09
Pages: 1-17
Cited by: 4
Abstract
Supervised machine learning is an increasingly popular tool for analyzing large political text corpora. The main disadvantage of supervised machine learning is the need for thousands of manually annotated training data points. This issue is particularly important in the social sciences, where most new research questions require new training data for a task tailored to the specific research question. This paper analyzes how deep transfer learning can help address this challenge by accumulating “prior knowledge” in language models. Models like BERT can learn statistical language patterns through pre-training (“language knowledge”), and reliance on task-specific data can be reduced by training on universal tasks like natural language inference (NLI; “task knowledge”). We demonstrate the benefits of transfer learning on a wide range of eight tasks. Across these eight tasks, our BERT-NLI model fine-tuned on 100 to 2,500 texts performs on average 10.7 to 18.3 percentage points better than classical models without transfer learning. Our study indicates that BERT-NLI fine-tuned on 500 texts achieves performance similar to classical models trained on around 5,000 texts. Moreover, we show that transfer learning works particularly well on imbalanced data. We conclude by discussing the limitations of transfer learning and by outlining new opportunities for political science research.
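The NLI approach summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' code or their exact model: it uses the Hugging Face transformers zero-shot-classification pipeline with a publicly available NLI checkpoint (facebook/bart-large-mnli here, standing in for the paper's BERT-NLI model), and the example text, labels, and hypothesis template are illustrative assumptions.

```python
# Minimal sketch of NLI-based classification (not the paper's code).
# An NLI model scores whether the input text entails a hypothesis built
# from each candidate label; this is the "task knowledge" idea from the
# abstract: one entailment task covers many labeling tasks.
from transformers import pipeline

# Any NLI-trained checkpoint works here; facebook/bart-large-mnli is a
# common public choice, used as a stand-in for the paper's BERT-NLI model.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Illustrative text and labels, not taken from the paper's eight tasks.
text = "The government announced new funding for renewable energy projects."
labels = ["environment", "economy", "defense"]

result = classifier(
    text,
    candidate_labels=labels,
    # Each label is inserted into this template to form the NLI hypothesis,
    # e.g. "This text is about environment."
    hypothesis_template="This text is about {}.",
)
print(result["labels"][0], round(result["scores"][0], 3))
```

Because every labeling task can be rephrased as "does this text entail hypothesis H?", a single NLI model transfers across tasks without architectural changes; fine-tuning it on a modest number of task-specific examples (the paper uses 100 to 2,500 texts) then adapts the entailment scoring to the new labels.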