DOI:10.1109/nnice61279.2024.10498558
Abstract
In recent years, attention mechanisms have been widely studied and applied to sequential recommendation systems. Although they achieve excellent performance, some problems remain. Users' historical behavior data often contain noise, and traditional attention-based sequential recommenders have large numbers of parameters, which makes them more susceptible to noisy data, degrading model performance and consuming substantial computation time and memory. To address this, this paper adopts a Sparse Transformer with a sampling strategy and introduces filter layers with learnable parameters into the model, combining filtering algorithms from signal processing with a cross-attention mechanism. In addition, time embeddings and position embeddings are introduced to strengthen the model's ability to learn temporal information and relative positional information in the interaction sequence. To verify the effectiveness of the proposed algorithm, experiments were conducted on three public datasets and achieved good results.
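The abstract does not specify the exact architecture, but a filter layer with learnable parameters typically operates in the frequency domain: the interaction sequence is transformed with an FFT, multiplied elementwise by learnable complex filter weights (which attenuate noisy frequency components), and transformed back. The following is a minimal NumPy sketch of that idea; the function and parameter names (`filter_layer`, `w_real`, `w_imag`) are hypothetical and not taken from the paper.

```python
import numpy as np

def filter_layer(x, w_real, w_imag):
    """Hypothetical learnable frequency-domain filter layer.

    x       : (seq_len, d) real-valued sequence of item embeddings
    w_real  : (seq_len // 2 + 1, d) real part of learnable filter weights
    w_imag  : (seq_len // 2 + 1, d) imaginary part of learnable filter weights
    """
    # Transform each embedding dimension along the sequence axis.
    freq = np.fft.rfft(x, axis=0)
    # Elementwise multiplication by a learnable complex filter;
    # during training these weights can learn to suppress noisy frequencies.
    freq = freq * (w_real + 1j * w_imag)
    # Back to the time domain, same length as the input.
    return np.fft.irfft(freq, n=x.shape[0], axis=0)

# Usage: with an identity filter (all-ones real part, zero imaginary part)
# the layer reproduces its input, confirming the transform round-trips.
seq_len, d = 8, 4
x = np.random.default_rng(0).standard_normal((seq_len, d))
w_real = np.ones((seq_len // 2 + 1, d))
w_imag = np.zeros((seq_len // 2 + 1, d))
y = filter_layer(x, w_real, w_imag)
```

In a full model, `w_real` and `w_imag` would be trained jointly with the attention layers, and the filtered sequence would then feed the cross-attention block described in the abstract.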