Computer science
Benchmark (surveying)
Machine learning
Artificial intelligence
Similarity (geometry)
Listing (finance)
Data mining
Geodesy
Finance
Economics
Image (mathematics)
Geography
Author
Di Han, Yifan Huang, Junmin Liu, Kai Liao, Kunling Lin
Abstract
Since the weights of a self-attention model are not affected by the interval between items in a sequence, the model can describe user interests more accurately and completely, which is why it is widely used in sequential recommendation. However, mainstream self-attention models focus on the similarity between items when computing the attention weights of user behavioral patterns, and fail to promptly reflect the impact of sudden drifts in user decisions. In this article, we introduce a bias strategy into the self-attention module, referred to as Learning Self-Attention Bias (LSAB), to more accurately learn fast-changing user behavioral patterns. LSAB adjusts the bias arising from the self-attention weights, leading to improved prediction performance in sequential recommendation. In addition, this article designs four types of attention-weight bias catering to diverse user behavior preferences. In tests on benchmark datasets, each bias strategy in LSAB improved state-of-the-art models, raising their performance by nearly 5% on average. The source code is publicly available at https://gitee.com/kyle-liao/lsab.
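The abstract does not give LSAB's exact formulation, but its core idea, adding a learnable bias to the self-attention weights so that attention is not driven by item similarity alone, can be sketched as follows. This is a minimal PyTorch sketch under that assumption; the class name BiasedSelfAttention, the single position-to-position bias table attn_bias, and all hyperparameters are hypothetical illustrations, not the paper's four bias types or its actual implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiasedSelfAttention(nn.Module):
    """Single-head causal self-attention with a learnable additive bias
    on the attention logits. Illustrative sketch only; the paper's LSAB
    formulation and its four bias types are not reproduced here."""

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Hypothetical learnable bias over query/key positions, meant to
        # let the model deviate from pure item-similarity attention.
        self.attn_bias = nn.Parameter(torch.zeros(max_len, max_len))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        seq_len = x.size(1)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scale = q.size(-1) ** 0.5
        logits = q @ k.transpose(-2, -1) / scale              # similarity term
        logits = logits + self.attn_bias[:seq_len, :seq_len]  # learned bias term
        # Causal mask: each position attends only to itself and earlier items.
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), 1)
        logits = logits.masked_fill(mask, float("-inf"))
        return F.softmax(logits, dim=-1) @ v

# Usage: attend over a batch of 2 sequences of 10 item embeddings.
attn = BiasedSelfAttention(d_model=64, max_len=50)
out = attn(torch.randn(2, 10, 64))  # (2, 10, 64)
```

A position-indexed additive bias like this lets the model up- or down-weight certain interactions independently of content similarity, which matches the abstract's motivation of capturing sudden drifts in user behavior that a pure similarity-based attention weight would miss.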