Computer science
Ambiguity
Artificial intelligence
Natural language processing
Symbol
Class (philosophy)
Feature engineering
Intuition
Machine learning
Information retrieval
Deep learning
Programming language
Mathematics
Philosophy
Arithmetic
Epistemology
Authors
Yi Zhu, Ye Wang, Jipeng Qiang, Xindong Wu
Source
Journal: IEEE Transactions on Knowledge and Data Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2023-11-23
Volume/Issue: 1-13
Citations: 13
Identifier
DOI: 10.1109/tkde.2023.3332787
Abstract
In short texts, the extremely short length, feature sparsity, and high ambiguity pose major challenges to classification tasks. Recently, prompt-learning, an effective method for tuning pre-trained language models for specific downstream tasks, has attracted a great deal of attention and research. The main intuition behind prompt-learning is to insert a template into the input and convert the task into an equivalent cloze-style task. However, most prompt-learning methods consider only the class name and a monotonous strategy for knowledge incorporation in cloze-style prediction, which inevitably incurs omissions and bias in short text classification tasks. In this paper, we propose a short text classification method with prompt-learning. Specifically, the top $M$ concepts related to each entity in the short text are retrieved from an open knowledge graph such as Probase; these concepts are first filtered by their distance to the class labels, which takes both the short text itself and the class name into consideration when expanding the label word space. We then design four additional strategies for integrating the expanded concepts, and the union of these concepts is finally adopted in the verbalizer of prompt-learning. Experimental results show that clear improvements are obtained over other state-of-the-art methods on five well-known datasets.
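The concept-selection step described above (retrieve candidate concepts for an entity, then keep those closest to the class labels before adding them to the verbalizer's label word space) can be sketched as follows. This is a hedged illustration, not the authors' implementation: the toy 2-D embeddings, the `select_concepts` helper, and the use of cosine similarity as the distance measure are all assumptions; a real system would use pre-trained word embeddings and concepts retrieved from a knowledge graph such as Probase.

```python
# Hypothetical sketch of verbalizer concept selection by embedding distance.
# Vectors below are toy 2-D embeddings, not real word vectors.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def select_concepts(concepts, class_labels, m):
    """Keep the top-m concepts closest (by max cosine similarity)
    to any class label; their union would expand the label word space."""
    scored = []
    for name, vec in concepts.items():
        score = max(cosine(vec, lv) for lv in class_labels.values())
        scored.append((score, name))
    scored.sort(reverse=True)
    return [name for _, name in scored[:m]]

# Toy example: concepts retrieved for an entity, one class label "sports".
concepts = {"athlete": (0.9, 0.1), "team": (0.8, 0.3), "fruit": (0.1, 0.9)}
labels = {"sports": (1.0, 0.0)}
print(select_concepts(concepts, labels, 2))  # -> ['athlete', 'team']
```

Selecting by distance to the class labels, rather than taking all retrieved concepts, is what lets the method use both the short text (via the entity's concepts) and the class name when building the verbalizer.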