Computer science
Normalization (linguistics)
Artificial intelligence
Randomness
Machine learning
Inference
Kullback-Leibler divergence
Principle of maximum entropy
Contrast (vision)
Classification
Consistency (knowledge bases)
Cross entropy
Pattern recognition (psychology)
Mathematics
Statistics
Authors
Nai Zhou,Nianmin Yao,Qibin Li,Jian Zhao,Yanan Zhang
Identifier
DOI:10.1016/j.knosys.2023.110588
Abstract
Semi-supervised learning has achieved impressive results and is commonly applied in text classification. However, in situations where labeled texts are exceedingly limited, neural networks are prone to over-fitting due to the non-negligible inconsistency between model training and inference caused by dropout mechanisms that randomly mask some neurons. To alleviate this inconsistency problem, we propose a simple Multiple Models Contrast learning method based on Consistent Regularization, named Multi-MCCR, which consists of multiple models with the same structure and a C-BiKL loss strategy. Specifically, one sample first goes through multiple identical models to obtain multiple different output distributions, which enriches the sample output distributions and provides conditions for the subsequent consistency approximation. Then, the C-BiKL loss strategy is proposed to minimize the combination of the bidirectional Kullback-Leibler (BiKL) divergence between the above multiple output distributions and the Cross-Entropy loss on labeled data, which provides consistency constraints (BiKL) for the model and simultaneously ensures correct classification (Cross-Entropy). Through this setting of multi-model contrast learning, the inconsistency caused by the randomness of dropout between model training and inference is alleviated, thereby avoiding over-fitting and improving classification ability in scenarios with limited labeled samples. We conducted experiments on six widely used text classification datasets, covering sentiment analysis, topic categorization, and review classification, and the experimental results show that our method is universally effective for semi-supervised text classification with limited labeled texts.
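The abstract describes the C-BiKL objective only in words. The sketch below shows one plausible reading in PyTorch: a Cross-Entropy term on the labeled subset combined with a pairwise bidirectional KL term between the output distributions produced by several identically structured models (each with independent dropout). The helper names (`bikl_divergence`, `c_bikl_loss`), the weighting coefficient `alpha`, and the pairwise averaging are assumptions made for illustration, not details taken from the paper.

```python
import torch
import torch.nn.functional as F


def bikl_divergence(p_logits, q_logits):
    """Bidirectional KL divergence between two output distributions.

    Hypothetical helper: averages KL(p || q) and KL(q || p), computed
    from the softmax of two models' logits.
    """
    p_log = F.log_softmax(p_logits, dim=-1)
    q_log = F.log_softmax(q_logits, dim=-1)
    kl_pq = F.kl_div(q_log, p_log, reduction="batchmean", log_target=True)  # KL(p || q)
    kl_qp = F.kl_div(p_log, q_log, reduction="batchmean", log_target=True)  # KL(q || p)
    return 0.5 * (kl_pq + kl_qp)


def c_bikl_loss(logits_list, labels, labeled_mask, alpha=1.0):
    """Sketch of a C-BiKL-style loss (assumed form, not the paper's exact code).

    logits_list : list of [batch, num_classes] logits, one per model copy
    labels      : [batch] class labels (only valid where labeled_mask is True)
    labeled_mask: [batch] boolean mask selecting the labeled samples
    alpha       : assumed weight balancing the consistency term
    """
    # Supervised term: Cross-Entropy on the labeled samples of the first model.
    ce = F.cross_entropy(logits_list[0][labeled_mask], labels[labeled_mask])

    # Consistency term: bidirectional KL between every pair of model outputs.
    bikl, pairs = 0.0, 0
    for i in range(len(logits_list)):
        for j in range(i + 1, len(logits_list)):
            bikl = bikl + bikl_divergence(logits_list[i], logits_list[j])
            pairs += 1
    bikl = bikl / max(pairs, 1)

    return ce + alpha * bikl
```

In this reading, the BiKL term pulls the dropout-perturbed output distributions of the model copies toward each other, which is what supplies the consistency constraint, while the Cross-Entropy term anchors the predictions to the available labels.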