Computer science
Field (mathematical analysis)
Representation (politics)
Joint (building)
Artificial intelligence
Recommender system
Feature (linguistics)
Feature learning
Transfer learning
Relevance (law)
Popularity
Machine learning
Data mining
Information retrieval
Architectural engineering
Psychology
Mathematical analysis
Social psychology
Linguistics
Philosophy
Mathematics
Politics
Political science
Law
Engineering
Authors
Shuo Xiao,Dongqing Zhu,Chaogang Tang,Zubing Huang
Identifier
DOI:10.1007/978-3-031-30672-3_30
Abstract
Cross-domain recommendation (CDR) improves recommendation accuracy by transferring knowledge from data-rich domains to sparse domains, a significant advance in dealing with data sparsity. Existing CDR works, however, still face several challenges, including (1) ignoring the long-tail distribution of user-item interactions and (2) transferring only the domain-shared feature preferences of common users. In this paper, we propose a CDR framework named joint Cross-Attention Transfer and Contrastive Learning for Cross-Domain Recommendation (CATCL). To address data sparsity and popularity bias within each domain, we first add random uniform noise to the original representation to maximize the mutual information between the original representation and its augmented view, and then pre-train to obtain more uniformly distributed user/item representations. In addition, we introduce a cross-attention mechanism that extracts users' domain-shared and domain-specific features in order to capture the relevance of user preferences across domains. We then employ an element-wise attention component that dynamically distributes weights between domain-specific and domain-shared features, allowing different features to carry different importance in the rich and sparse domains. Experimental results on three public datasets demonstrate the effectiveness of the proposed framework against many strong state-of-the-art approaches.
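The intra-domain pre-training step described above — perturbing representations with random uniform noise and maximizing the mutual information between the original representation and its augmented view — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: it assumes a SimGCL-style noise augmentation and an InfoNCE objective, and the helper names, `eps`, and `tau` are hypothetical.

```python
import numpy as np

def add_uniform_noise(emb, eps=0.1, seed=None):
    """Build an augmented view by adding small random uniform noise.

    The noise is sign-aligned with the embedding and unit-normalized per
    row, so the augmented view stays close to the original (a common
    choice in noise-based graph contrastive learning; eps is assumed).
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(0.0, 1.0, emb.shape)
    noise = np.sign(emb) * noise                          # keep the perturbation in the same orthant
    noise /= np.linalg.norm(noise, axis=1, keepdims=True) # unit-length noise per row
    return emb + eps * noise

def info_nce(view1, view2, tau=0.2):
    """InfoNCE loss between two views of the same users/items.

    Each row i of view1 treats row i of view2 as its positive and all
    other rows as in-batch negatives; minimizing this maximizes a lower
    bound on the mutual information between the two views.
    """
    v1 = view1 / np.linalg.norm(view1, axis=1, keepdims=True)
    v2 = view2 / np.linalg.norm(view2, axis=1, keepdims=True)
    logits = v1 @ v2.T / tau                              # cosine similarities / temperature
    pos = np.diag(logits)                                 # matching pairs on the diagonal
    loss = -pos + np.log(np.exp(logits).sum(axis=1))      # -log softmax of the positive
    return loss.mean()

# Toy usage: 8 user embeddings of dimension 16 and their augmented view.
emb = np.random.default_rng(0).normal(size=(8, 16))
aug = add_uniform_noise(emb, eps=0.1, seed=1)
loss = info_nce(emb, aug)
```

Because the loss pushes non-matching pairs apart on the unit hypersphere, minimizing it tends to spread the user/item representations more uniformly, which is the stated mechanism for mitigating popularity bias.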