Computer science
Recommender systems
Collaborative filtering
Machine learning
Transformer
Graph
Generative grammar
Artificial intelligence
Theoretical computer science
Quantum mechanics
Physics
Voltage
Authors
C. Li, Lianghao Xia, Xubin Ren, Yaowen Ye, Yong Xu, Chao Huang
Identifier
DOI: 10.1145/3539618.3591723
Abstract
This paper presents a novel approach to representation learning in recommender systems by integrating generative self-supervised learning with a graph transformer architecture. We highlight the importance of high-quality data augmentation with relevant self-supervised pretext tasks for improving performance. Towards this end, we propose a new approach that automates the self-supervision augmentation process through rationale-aware generative SSL that distills informative user-item interaction patterns. The proposed Graph Transformer-based recommender (GFormer) offers parameterized collaborative rationale discovery for selective augmentation while preserving global-aware user-item relationships. In GFormer, we allow the rationale-aware SSL to inspire graph collaborative filtering with task-adaptive invariant rationalization in the graph transformer. The experimental results reveal that GFormer consistently improves performance over baselines on different datasets. Several in-depth experiments further investigate the invariant rationale-aware augmentation from various aspects. The source code for this work is publicly available at: https://github.com/HKUDS/GFormer.
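The core idea of rationale-aware selective augmentation — scoring user-item edges and keeping only the most informative ones as the "rationale" subgraph used for self-supervision — can be illustrated with a minimal sketch. This is an assumption-laden simplification, not GFormer's actual method: GFormer learns the edge scores end-to-end with a parameterized rationale discovery module, whereas here a simple dot-product affinity between fixed user and item embeddings stands in for the learned scorer; the function name `rationale_augment` and the `keep_ratio` parameter are hypothetical.

```python
import numpy as np

def rationale_augment(user_emb, item_emb, edges, keep_ratio=0.5):
    """Keep the top-scoring fraction of user-item edges as a 'rationale'
    subgraph for augmentation.

    user_emb : (num_users, d) user embedding matrix
    item_emb : (num_items, d) item embedding matrix
    edges    : (num_edges, 2) array of (user_idx, item_idx) interactions
    """
    u, i = edges[:, 0], edges[:, 1]
    # Dot-product affinity stands in for GFormer's learned rationale score.
    scores = (user_emb[u] * item_emb[i]).sum(axis=1)
    k = max(1, int(keep_ratio * len(edges)))
    keep = np.argsort(-scores)[:k]  # indices of the highest-affinity edges
    return edges[keep]
```

Edges that survive this selection would then define the augmented graph view fed to the self-supervised objective, while low-scoring (presumably noisy) interactions are dropped.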