Computer science
Overfitting
Preference
Artificial intelligence
Machine learning
Memorization
Adaptation (eye)
Adaptability
Task (project management)
Benchmark (surveying)
Human-computer interaction
Artificial neural network
Engineering
Geodesy
Mathematics education
Physics
Economics
Microeconomics
Optics
Biology
Systems engineering
Geography
Mathematics
Ecology
Authors
Chunyang Wang, Yanmin Zhu, Aixin Sun, Zhaobo Wang, Ke Wang
Identifier
DOI: 10.1145/3539618.3591627
Abstract
The issue of user cold-start poses a long-standing challenge to recommendation systems, due to the scarcity of interactions from new users. Recently, meta-learning-based studies treat each cold-start user as a user-specific few-shot task and then derive meta-knowledge about fast model adaptation across training users. However, existing solutions mostly do not clearly distinguish the concept of new users from the concept of novel preferences, leading to over-reliance on meta-learning-based adaptability to novel patterns. In addition, we argue that the existing meta-training task construction inherently suffers from the memorization overfitting issue, which inevitably hinders meta-generalization to new users. In response to these issues, we propose a preference learning decoupling framework enhanced with meta-augmentation (PDMA) for user cold-start recommendation. To rescue meta-learning from unnecessary adaptation to common patterns, our framework decouples preference learning for a cold-start user into two complementary aspects: common preference transfer and novel preference adaptation. To handle the memorization overfitting issue, we further propose to augment meta-training users by injecting attribute-based noise, to achieve mutually exclusive tasks. Extensive experiments on benchmark datasets demonstrate that our framework achieves superior performance compared with state-of-the-art methods. We also show that our proposed framework is effective in alleviating memorization overfitting.
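As a rough illustration of the meta-augmentation idea mentioned in the abstract, the sketch below perturbs a user's attribute representation with different noise vectors to produce several variants of one meta-training task, so that a model cannot solve all variants by memorizing a single attribute-to-preference mapping and must instead adapt from each task's support set. This is only a minimal sketch of the general technique, assuming dense attribute embeddings; the function and parameter names (augment_user_task, noise_scale, num_variants) are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_user_task(attr_emb: np.ndarray, labels: np.ndarray,
                      num_variants: int = 2, noise_scale: float = 0.1):
    """Create several 'mutually exclusive' variants of one meta-training task.

    Each variant perturbs the user's attribute embedding with a different
    noise vector, while keeping the item feedback labels unchanged, so the
    attribute-to-label mapping differs across variants.
    """
    tasks = []
    for _ in range(num_variants):
        # Attribute-based noise: small Gaussian perturbation of the embedding.
        noise = rng.normal(scale=noise_scale, size=attr_emb.shape)
        tasks.append({"attributes": attr_emb + noise, "labels": labels})
    return tasks

# Toy usage: one meta-training user with an 8-dim attribute embedding
# and binary feedback on 5 items.
user_attr = rng.normal(size=8)
item_feedback = rng.integers(0, 2, size=5)
augmented = augment_user_task(user_attr, item_feedback, num_variants=3)
print(len(augmented), augmented[0]["attributes"].shape)
```

In this toy setup, each augmented task would be split into support and query sets as usual in meta-training; the noise scale controls how strongly the variants disagree, which is the lever the abstract's mutually exclusive task construction relies on.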