Computer science
Machine learning
Artificial intelligence
Graph
Confusion matrix
Embedding
Mobile social networks
Data mining
Mobile computing
Theoretical computer science
Computer networks
Authors
Xinxin Hu,Haotian Chen,Hongchang Chen,Shuxin Liu,Xing Li,Shibo Zhang,Yahui Wang,Xiangyang Xue
Source
Journal: IEEE Transactions on Computational Social Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-08-17
Volume/Issue: 11 (2): 2675-2690
Citations: 10
Identifiers
DOI: 10.1109/tcss.2023.3302651
Abstract
In recent years, the increasing prevalence of mobile social network fraud has caused significant distress and depletion of personal and social wealth, resulting in considerable economic harm. Graph neural networks (GNNs) have emerged as a popular approach to tackle this issue. However, the challenge of graph imbalance, which can greatly impede the effectiveness of GNN-based fraud detection methods, has received little attention in prior research. To address this gap, this article presents a novel cost-sensitive graph neural network (CSGNN). Initially, reinforcement learning is used to train a suitable sampling threshold, followed by neighbor sampling based on node similarity, which provides an initial mitigation of the graph imbalance issue. Subsequently, message aggregation is performed on the sampled graph using a GNN to obtain node embeddings. Concurrently, the optimization objective for the cost matrix is formulated using the sample histogram matrix, scatter matrix, and confusion matrix. The cost matrix and the GNN are jointly optimized through the backpropagation algorithm. Finally, the resulting cost-sensitive node embeddings are used to detect fraudulent nodes. Furthermore, this study provides a theoretical demonstration of the effectiveness of adaptive cost-sensitive learning in GNNs. Extensive experiments on two publicly available real-world mobile network fraud datasets show that the proposed CSGNN effectively addresses the graph imbalance issue while outperforming state-of-the-art algorithms in detection performance. The CSGNN code and datasets can be accessed at https://github.com/xxhu94/CSGNN.
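The similarity-based neighbor sampling step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes cosine similarity over raw node features and uses a fixed cutoff in place of the reinforcement-learned sampling threshold; the function name `sample_neighbors` is hypothetical.

```python
import numpy as np

def sample_neighbors(features, adj, threshold):
    """Keep only edges whose endpoints are sufficiently similar.

    features:  (n, d) node feature matrix
    adj:       (n, n) binary adjacency matrix
    threshold: similarity cutoff (learned via RL in the paper;
               treated as a fixed hyperparameter in this sketch)
    """
    # Normalize rows so the dot product gives cosine similarity.
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    unit = features / np.clip(norms, 1e-12, None)
    sim = unit @ unit.T
    # Retain an edge only if it exists and its similarity passes the cutoff.
    return adj * (sim >= threshold)

# Toy example: node 2's features differ sharply from nodes 0 and 1,
# so its edges are pruned before message aggregation.
features = np.array([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0]])
adj = np.ones((3, 3)) - np.eye(3)
sampled = sample_neighbors(features, adj, threshold=0.9)
```

In the full pipeline, message aggregation would then run on `sampled` rather than `adj`, so dissimilar (and likely cross-class) neighbors no longer dilute the embeddings of minority-class fraud nodes.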