Generality
Computer science
Recommender system
Noise (video)
Preference
Denoising
Training set
Artificial intelligence
Noisy data
Machine learning
Theory (learning stability)
Data mining
Mathematics
Statistics
Image (mathematics)
Psychology
Psychotherapist
Authors
Zhuangzhuang He,Yifan Wang,Yonghui Yang,Peijie Sun,Le Wu,Haoyue Bai,Jinqi Gong,Richang Hong,Min Zhang
Identifier
DOI:10.1145/3637528.3671692
Abstract
Due to its availability and generality in online services, implicit feedback is commonly used in recommender systems. However, implicit feedback often contains noisy samples in real-world recommendation scenarios (such as misclicks or non-preferential behaviors), which hinders precise user preference learning. To overcome the noisy-sample problem, a popular solution drops noisy samples during model training, following the observation that noisy samples incur higher training losses than clean samples. Despite its effectiveness, we argue that this solution still has limits. (1) High training losses can result from model optimization instability or hard samples, not just noisy samples. (2) Completely dropping noisy samples aggravates data sparsity and fails to fully exploit the data.
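A minimal sketch of the loss-based sample-dropping baseline that the abstract criticizes (not the method proposed in this paper): during each training step, per-sample losses are computed and the highest-loss interactions are discarded, under the assumption that noisy feedback tends to incur larger losses. The function name `denoising_bce_step`, the `drop_rate` parameter, and the use of binary cross-entropy are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the common loss-based dropping strategy for denoising implicit
# feedback, NOT the authors' proposed method. Names and hyperparameters are
# hypothetical and chosen only for illustration.
import torch
import torch.nn.functional as F

def denoising_bce_step(scores: torch.Tensor, labels: torch.Tensor,
                       drop_rate: float = 0.1) -> torch.Tensor:
    """Compute a training loss that ignores the highest-loss samples,
    following the observation that noisy interactions usually have
    larger losses than clean ones."""
    # Per-sample binary cross-entropy on implicit-feedback labels (0/1).
    losses = F.binary_cross_entropy_with_logits(scores, labels, reduction="none")
    # Keep only the (1 - drop_rate) fraction of samples with the smallest loss.
    num_keep = max(1, int(losses.numel() * (1.0 - drop_rate)))
    kept_losses, _ = torch.topk(losses, num_keep, largest=False)
    return kept_losses.mean()

# Usage: inside an ordinary training loop, replace the plain BCE loss with
#   loss = denoising_bce_step(model(users, items), labels)
# and backpropagate as usual.
```

As the abstract argues, this strategy can mistakenly drop hard or unstably optimized samples and worsens data sparsity, which is the limitation the paper sets out to address.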