Computer science
Exploit
Overfitting
Task (project management)
Machine learning
Artificial intelligence
Pretext
Optimal distinctiveness theory
Benchmark (surveying)
Popularity
Recommender system
Relevance (law)
Geodesy
Economy
Politics
Social psychology
Management
Law
Geography
Psychotherapist
Artificial neural network
Computer security
Political science
Psychology
Authors
Mingdai Yang,Zhiwei Liu,Liangwei Yang,Xiaolong Liu,Chen Wang,Hao Peng,Philip S. Yu
Identifier
DOI: 10.1145/3616855.3635811
Abstract
Although pretraining has garnered significant attention and popularity in recent years, its application in graph-based recommender systems remains limited. Exploiting prior knowledge through pretraining is challenging on the widely used ID-dependent datasets. On the one hand, user-item interaction history from one dataset can hardly be transferred through pretraining to other datasets, whose IDs differ. On the other hand, pretraining and fine-tuning on the same dataset carries a high risk of overfitting. In this paper, we propose a novel multitask pretraining framework named Unified Pretraining for Recommendation via Task Hypergraphs (UPRTH). To obtain a unified learning pattern that handles the diverse requirements and nuances of various pretext tasks, we design task hypergraphs that generalize pretext tasks to hyperedge prediction. A novel transitional attention layer is devised to discriminatively learn the relevance between each pretext task and recommendation. Experimental results on three benchmark datasets verify the superiority of UPRTH, and additional detailed investigations demonstrate the effectiveness of the proposed framework.
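The abstract names two mechanisms without detail: generalizing pretext tasks to hyperedge prediction on task hypergraphs, and a transitional attention layer that learns how relevant each pretext task is to recommendation. The sketch below is a minimal, hypothetical PyTorch rendering of those two ideas, not the authors' implementation; the module names, tensor shapes, dot-product relevance scoring, and mean-pooled hyperedge scoring are all assumptions.

```python
# Hypothetical sketch of the two ideas named in the abstract; all shapes,
# names, and scoring functions are assumptions, not the UPRTH code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TransitionalAttention(nn.Module):
    """Fuse per-pretext-task node embeddings, using the recommendation
    embedding as the query so each pretext task contributes in
    proportion to its learned relevance to recommendation."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)  # projects recommendation embeddings
        self.key = nn.Linear(dim, dim)    # projects each task's embeddings

    def forward(self, rec_emb: torch.Tensor, task_embs: torch.Tensor) -> torch.Tensor:
        # rec_emb:   (N, d)    node embeddings from the recommendation task
        # task_embs: (T, N, d) one embedding table per pretext task
        q = self.query(rec_emb).unsqueeze(0)            # (1, N, d)
        k = self.key(task_embs)                         # (T, N, d)
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5    # (T, N) per-task relevance
        weights = F.softmax(scores, dim=0)              # normalize across tasks
        fused = (weights.unsqueeze(-1) * task_embs).sum(0)  # (N, d)
        return rec_emb + fused                          # residual fusion


def hyperedge_score(node_embs: torch.Tensor, members: list[int]) -> torch.Tensor:
    """Score one candidate hyperedge as the mean inner product between each
    member and the hyperedge summary (a common heuristic; an assumption here)."""
    e = node_embs[members]                 # (k, d) member embeddings
    summary = e.mean(dim=0, keepdim=True)  # (1, d) hyperedge summary
    return (e * summary).sum(-1).mean()    # scalar plausibility score


if __name__ == "__main__":
    N, d, T = 100, 32, 3                   # nodes, embedding dim, pretext tasks
    attn = TransitionalAttention(d)
    fused = attn(torch.randn(N, d), torch.randn(T, N, d))
    print(hyperedge_score(fused, [0, 4, 7]))  # score one candidate hyperedge
```

Querying with the recommendation embedding lets the softmax over tasks downweight pretext tasks whose signal is less relevant to recommendation, which is the role the abstract assigns to the transitional attention layer; the hyperedge scorer stands in for the unified "hyperedge prediction" objective that each pretext task is reduced to.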