Computer science
Graph
Point of interest
Information retrieval
Embedding
Data mining
Recommender system
Construct (Python library)
Artificial intelligence
Data science
Theoretical computer science
Programming language
Identifier
DOI:10.1016/j.eswa.2024.123436
Abstract
In the era of big data, information overload complicates user decision-making, and recommendation systems aim to assist in this process. Research on point-of-interest (POI) recommendation has been gaining momentum in recent years, with several studies pointing to open issues. Previous studies often used heterogeneous graphs to learn across different entity types, overlooking relationships between entities of the same type. Some studies extract raw node features from a single source only, disregarding information diversity, whereas others employ inappropriate methods that fail to preserve the inherent characteristics of the relevant information when designing raw inputs. Integrating multiple sources of information can also introduce noise into the data, which the approaches used in related research may not handle effectively. To address these issues, we propose a hybrid structural graph attention network (HS-GAT) for POI recommendation. In this approach, multisource data are first preprocessed and the relevant raw features are initialized. Heterogeneous graphs are then built for user-POI-POI attributes and POI-user-user attributes. These heterogeneous graphs are aggregated using a dual-attention mechanism to create embedding matrices for users and POIs, which are in turn used to construct user-user and POI-POI homogeneous graphs. These graph structures, combined with the user and POI embeddings obtained from the heterogeneous graphs, are fed into a graph attention network (GAT), which yields the final embedding representations for users and POIs. Finally, POI recommendations are scored as inner products of user and POI embeddings. A comprehensive performance evaluation of HS-GAT on the Yelp, Boston, Chicago, and London datasets demonstrated that the proposed approach outperforms other state-of-the-art methods.
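The pipeline sketched in the abstract (homogeneous-graph attention over node embeddings, followed by inner-product scoring) can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the graph, dimensions, and single-head GAT layer below are illustrative assumptions, standing in for the embeddings that HS-GAT would produce from its heterogeneous-graph stage.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, adj, W, a):
    """One single-head graph-attention layer: each node attends over
    its neighbors (per the adjacency matrix) and aggregates their
    linearly transformed features with softmax-normalized weights."""
    Z = H @ W                                    # transform node features
    out = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        nbrs = np.where(adj[i] > 0)[0]
        # attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
        pairs = np.concatenate([np.tile(Z[i], (len(nbrs), 1)), Z[nbrs]], axis=1)
        e = pairs @ a
        e = np.where(e > 0, e, 0.2 * e)          # LeakyReLU, slope 0.2
        alpha = softmax(e)                       # normalized attention weights
        out[i] = alpha @ Z[nbrs]                 # weighted neighbor aggregation
    return out

rng = np.random.default_rng(0)
n_users, n_pois, d = 4, 5, 8
U = rng.normal(size=(n_users, d))   # user embeddings from the heterogeneous stage
P = rng.normal(size=(n_pois, d))    # POI embeddings from the heterogeneous stage

# hypothetical homogeneous user-user graph; self-loops guarantee neighbors
adj_u = np.eye(n_users) + (rng.random((n_users, n_users)) > 0.5)
W = rng.normal(size=(d, d))
a = rng.normal(size=(2 * d,))

U_final = gat_layer(U, adj_u, W, a)
scores = U_final @ P.T              # recommendation scores as inner products
top_poi = scores.argmax(axis=1)     # highest-scoring POI per user
```

In the full model the same attention step would also run on the POI-POI graph, and both inputs would carry the dual-attention aggregated features rather than random vectors; the sketch only shows the structural shape of the final scoring stage.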