Compact space
Subspace topology
Embedding
Computer science
Scalability
Mutual information
Dimension (graph theory)
Theoretical computer science
Feature learning
Coding (social sciences)
Invariant (physics)
Artificial intelligence
Mathematics
Pure mathematics
Database
Mathematical physics
Statistics
Authors
Huiyuan Chen,Vivian Lai,Hongye Jin,Zhimeng Jiang,Mahashweta Das,Xia Hu
Identifier
DOI:10.1145/3616855.3635832
Abstract
Contrastive Learning (CL) has shown promising performance in collaborative filtering. The key idea is to generate augmentation-invariant embeddings by maximizing the mutual information between different augmented views of the same instance. However, we empirically observe that existing CL models suffer from the dimensional collapse issue, where user/item embeddings span only a low-dimensional subspace of the entire feature space. This suppresses information in the remaining dimensions and weakens the distinguishability of embeddings. Here we propose a non-contrastive learning objective, named nCL, which explicitly mitigates dimensional collapse of representations in collaborative filtering. Our nCL aims to achieve the geometric properties of alignment and compactness on the embedding space. In particular, alignment pushes together the representations of positively related user-item pairs, while compactness finds the optimal coding length of user/item embeddings subject to a given distortion. More importantly, nCL requires neither data augmentation nor negative sampling during training, making it scalable to large datasets. Experimental results demonstrate the superiority of our nCL.
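The two geometric objectives in the abstract can be made concrete. Below is a minimal PyTorch sketch, assuming alignment is the mean squared distance between L2-normalized embeddings of positive user-item pairs and compactness follows the rate-distortion coding-length form R(Z) = 1/2 logdet(I + d/(n*eps^2) Z^T Z); the function names (alignment_loss, coding_rate, ncl_loss) and the weight lam are our illustration, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def alignment_loss(user_emb: torch.Tensor, item_emb: torch.Tensor) -> torch.Tensor:
    """Mean squared distance between normalized positive user-item pairs."""
    user_emb = F.normalize(user_emb, dim=-1)
    item_emb = F.normalize(item_emb, dim=-1)
    return (user_emb - item_emb).pow(2).sum(dim=-1).mean()

def coding_rate(emb: torch.Tensor, eps: float = 0.05) -> torch.Tensor:
    """Coding length of a batch of embeddings at distortion eps:
    R(Z) = 1/2 * logdet(I + d / (n * eps^2) * Z^T Z)."""
    n, d = emb.shape
    gram = emb.T @ emb                       # d x d second-moment matrix
    eye = torch.eye(d, device=emb.device)
    return 0.5 * torch.logdet(eye + (d / (n * eps ** 2)) * gram)

def ncl_loss(user_emb, item_emb, lam: float = 1.0, eps: float = 0.05):
    """Hypothetical combined objective: alignment pulls positive pairs
    together, while subtracting the coding rate rewards embeddings that
    occupy more dimensions, counteracting dimensional collapse."""
    align = alignment_loss(user_emb, item_emb)
    rate = coding_rate(F.normalize(user_emb, dim=-1), eps) + \
           coding_rate(F.normalize(item_emb, dim=-1), eps)
    return align - lam * rate

# Usage: embeddings for a batch of 256 positive user-item interactions.
users = torch.randn(256, 64, requires_grad=True)
items = torch.randn(256, 64, requires_grad=True)
loss = ncl_loss(users, items)
loss.backward()
```

Note that the batch-level coding rate needs only the positive pairs in the batch, which is consistent with the abstract's claim that no negative sampling or data augmentation is required.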