Multi-MCCR: Multiple models regularization for semi-supervised text classification with few labels

Computer science, Regularization (linguistics), Artificial intelligence, Randomness, Machine learning, Inference, Kullback–Leibler divergence, Principle of maximum entropy, Contrast (vision), Classification, Consistency (knowledge bases), Cross-entropy, Pattern recognition (psychology), Mathematics, Statistics
Authors
Nai Zhou,Nianmin Yao,Qibin Li,Jian Zhao,Yanan Zhang
Source
Journal: Knowledge-Based Systems [Elsevier BV]
Volume 272, Article 110588 · Cited by: 3
Identifier
DOI:10.1016/j.knosys.2023.110588
Abstract

Semi-supervised learning has achieved impressive results and is commonly applied to text classification. However, when labeled texts are exceedingly limited, neural networks are prone to over-fitting because of the non-negligible inconsistency between model training and inference caused by dropout, which randomly masks some neurons. To alleviate this inconsistency, we propose a simple Multiple Models Contrast learning scheme based on Consistent Regularization, named Multi-MCCR, which consists of multiple models with the same structure and a C-BiKL loss strategy. Specifically, a sample first passes through multiple identically structured models to obtain multiple different output distributions, which enriches the sample's output distributions and provides the conditions for the subsequent consistency approximation. The C-BiKL loss strategy then minimizes the combination of the bidirectional Kullback–Leibler (BiKL) divergence between these output distributions and the cross-entropy loss on labeled data, which imposes a consistency constraint on the model (BiKL) while ensuring correct classification (cross-entropy). Through this multi-model contrast learning setup, the inconsistency between model training and inference caused by the randomness of dropout is alleviated, thereby avoiding over-fitting and improving classification performance in scenarios with limited labeled samples. We conducted experiments on six widely used text classification datasets, covering sentiment analysis, topic categorization, and review classification, and the results show that our method is universally effective for semi-supervised text classification with limited labeled texts.
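Based on the abstract, the training objective combines a bidirectional KL (BiKL) consistency term across the outputs of several identically structured models with a cross-entropy term on labeled data. The PyTorch sketch below illustrates one plausible form of such a C-BiKL-style loss; the number of models K, the weighting coefficient `alpha`, the pairwise averaging, and all function and variable names are illustrative assumptions, since the abstract does not specify these details.

```python
# Hedged sketch of a C-BiKL-style objective, reconstructed from the abstract only.
import itertools
import torch
import torch.nn.functional as F

def bikl(p_logits: torch.Tensor, q_logits: torch.Tensor) -> torch.Tensor:
    """Bidirectional Kullback-Leibler divergence between two predicted distributions."""
    p_log, q_log = F.log_softmax(p_logits, dim=-1), F.log_softmax(q_logits, dim=-1)
    p, q = p_log.exp(), q_log.exp()
    kl_pq = F.kl_div(q_log, p, reduction="batchmean")  # KL(p || q)
    kl_qp = F.kl_div(p_log, q, reduction="batchmean")  # KL(q || p)
    return kl_pq + kl_qp

def c_bikl_loss(models, x, labels=None, alpha: float = 1.0) -> torch.Tensor:
    """Consistency (BiKL) term over all model pairs plus cross-entropy on labeled data.

    `models` are K networks sharing the same architecture; with dropout active
    during training, the same input yields K different output distributions.
    `alpha` weights the consistency term and is an assumed hyperparameter.
    """
    logits = [m(x) for m in models]                       # K output distributions
    pairs = list(itertools.combinations(range(len(logits)), 2))
    consistency = sum(bikl(logits[i], logits[j]) for i, j in pairs) / len(pairs)
    loss = alpha * consistency
    if labels is not None:                                # supervised term on the labeled batch
        ce = sum(F.cross_entropy(z, labels) for z in logits) / len(logits)
        loss = loss + ce
    return loss
```

In this reading, unlabeled batches contribute only the BiKL consistency term, while labeled batches contribute both terms, so the consistency constraint regularizes the models even when labels are scarce.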