Artificial intelligence
Computer science
Semi-supervised learning
Machine learning
Bundling
Supervised learning
Leverage (statistics)
Robustness (evolution)
Exploit
Natural language processing
Graph
Ensemble learning
Generalization
Mathematics
Artificial neural network
Theoretical computer science
Mathematical analysis
Gene
Biochemistry
Composite material
Chemistry
Materials science
Computer security
Authors
Chuanjiu Wu, Huanhuan Yuan, Pengpeng Zhao, Jianfeng Qu, Victor S. Sheng, Guanfeng Liu
Identifier
DOI:10.1109/tcss.2023.3331255
Abstract
Self-supervised contrastive learning has greatly advanced the development of bundle recommendation. However, self-supervised contrastive learning may misclassify some positive samples as negative samples, resulting in suboptimal models. Supervised contrastive learning has therefore been proposed to alleviate this drawback. Inspired by this idea, we seek a more elegant contrastive learning paradigm for recommendation and propose dual-supervised contrastive learning for bundle recommendation (DSCBR), which integrates supervised and self-supervised contrastive learning to exploit the full potential of contrastive learning. Specifically, we first construct self-supervised contrastive learning between two different views (the bundle view and the item view), which encourages the alignment of the two separately learned views and boosts the effectiveness of the learned representations. Second, we use the interaction information to construct supervised contrastive learning and leverage the bundle–bundle co-occurrence graph for further enhancement. By introducing supervised contrastive learning, our model explicitly models user and bundle proximity in different views, improving the model's robustness and generalization. Finally, we jointly perform self-supervised and supervised contrastive learning across multiple views. Extensive experiments on three public datasets demonstrate the effectiveness of our model.
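The abstract combines two contrastive objectives: a cross-view self-supervised loss (InfoNCE between the bundle-view and item-view embeddings of the same user) and a supervised contrastive loss in which samples sharing a label act as additional positives. The sketch below illustrates these two standard loss forms in NumPy under stated assumptions; it is not the authors' implementation, and the embedding shapes, temperature, and weighting coefficient are hypothetical.

```python
import numpy as np


def cosine_sim(a, b):
    """Row-wise cosine similarity matrix between two embedding sets."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T


def self_supervised_cl(view1, view2, tau=0.2):
    """Cross-view InfoNCE: row i of view1 and row i of view2 are the
    positive pair (same user seen from two views); all other rows of
    view2 serve as negatives."""
    sim = cosine_sim(view1, view2) / tau
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))


def supervised_cl(emb, labels, tau=0.2):
    """Supervised contrastive loss: for each anchor, every other sample
    with the same label is a positive; the anchor itself is excluded."""
    n = len(emb)
    eye = np.eye(n, dtype=bool)
    sim = np.where(eye, -np.inf, cosine_sim(emb, emb) / tau)  # mask self
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & ~eye
    has_pos = pos.sum(axis=1) > 0  # skip anchors with no positives
    mean_log_prob = (np.where(pos, log_prob, 0.0).sum(axis=1)[has_pos]
                     / pos.sum(axis=1)[has_pos])
    return -mean_log_prob.mean()


# Toy demonstration with random embeddings (illustrative only).
rng = np.random.default_rng(0)
bundle_view = rng.normal(size=(4, 8))   # 4 users, 8-dim bundle-view embeddings
item_view = rng.normal(size=(4, 8))     # same users in the item view
labels = np.array([0, 0, 1, 1])         # e.g. shared interaction groups

# Joint objective: self-supervised + weighted supervised term
# (the 0.5 weight is an arbitrary placeholder).
joint_loss = (self_supervised_cl(bundle_view, item_view)
              + 0.5 * supervised_cl(bundle_view, labels))
```

In practice such losses are computed on mini-batches of GNN-produced embeddings and added to the main recommendation loss; the bundle–bundle co-occurrence graph mentioned in the abstract would supply additional positive pairs for the supervised term.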