Feature (linguistics)
Computer science
Artificial intelligence
Philosophy
Linguistics
Authors
Jungang Lou,Rongzhen Qin,Qing Shen,C.K. Sha
Source
Journal: IEEE Transactions on Computational Social Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-04-01
Volume/Issue: 11 (2): 2889-2900
Citations: 1
Identifier
DOI:10.1109/tcss.2023.3313622
Abstract
Several current click-through rate (CTR) prediction models based on user behavior sequences follow a common paradigm: they first apply embedding techniques to map users' past behaviors to low-dimensional dense vectors, and then use an attention mechanism, with the current candidate item as the query, to derive a user interest representation from the behavior sequence. However, these approaches overemphasize the items in the historical sequence that are similar to the candidate item and neglect both other contextual features and users' sequential behavior patterns. In this article, we present a deep click-through rate prediction model that incorporates multigranularity interest activation and implicit feature interactions. Our model first incorporates a nonlinearly extended user representation into the user behavior sequence and uses multiple fully connected layers to obtain a global user interest representation, improving the model's memorization of users. A multikernel convolutional network then learns the user's behavior patterns at different window sizes, addressing pattern diversity and interest-mutation noise in behavior sequences. Finally, the model performs implicit second-order feature interactions across user-side, item-side, and contextual features via a multihead self-attention network, which maintains the model's performance when user behavior sequences are scarce. Compared with the benchmark deep interest network (DIN), our model achieved RelaImpr gains of 1.67%, 3.36%, and 3.04% on three publicly available datasets, and 6.09%, 6.08%, and 10.22% when user history behavior sequence information is removed. We also present ablation experiments on each module and an analysis of the parameters that most affect model performance.
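The two sequence-modeling ideas named in the abstract — DIN-style interest activation with the candidate item as the attention query, and a multikernel convolution over the behavior sequence at several window sizes — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the dot-product scoring, and the mean-window/max-pool convolution stand-in are all assumptions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_interest(behaviors, candidate):
    """DIN-style interest activation (sketch): score each past-behavior
    embedding against the candidate item, then return the weighted sum
    as the user's interest representation for this candidate."""
    scores = behaviors @ candidate        # (T,) relevance per behavior
    weights = softmax(scores)             # attention weights, sum to 1
    return weights @ behaviors            # (d,) interest vector

def multikernel_patterns(behaviors, kernel_sizes=(2, 3)):
    """Hypothetical stand-in for the multikernel convolutional module:
    average sliding windows of several sizes (different granularities of
    behavior patterns), then max-pool over positions per kernel size."""
    feats = []
    for k in kernel_sizes:
        windows = np.stack([behaviors[i:i + k].mean(axis=0)
                            for i in range(len(behaviors) - k + 1)])
        feats.append(windows.max(axis=0))  # (d,) per kernel size
    return np.concatenate(feats)           # (d * len(kernel_sizes),)

rng = np.random.default_rng(0)
seq = rng.normal(size=(5, 4))   # 5 past behaviors, embedding dim 4
cand = rng.normal(size=4)       # candidate item embedding
interest = attention_interest(seq, cand)   # shape (4,)
patterns = multikernel_patterns(seq)       # shape (8,)
```

In a full model, `interest` and `patterns` would be concatenated with user-side, item-side, and contextual feature embeddings before the feature-interaction and prediction layers; here they simply show the shapes each module produces.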