Computer Science
Overfitting
Machine Learning
Artificial Intelligence
Graph
Artificial Neural Network
Theoretical Computer Science
Authors
Xiaohe Li, Zide Fan, Feilong Huang, Xuming Hu, Yawen Deng, Lei Wang, Xinyu Zhao
Source
Journal: Neurocomputing
[Elsevier]
Date: 2024-01-05
Volume/Pages: 574: 127229-127229
Citations: 15
Identifier
DOI: 10.1016/j.neucom.2023.127229
Abstract
Graph Neural Network (GNN) stands as an emerging methodology for graph-based learning tasks, particularly for node classification. This study elucidates the susceptibility of GNN to discrepancies arising from imbalanced node labels. Conventional solutions for imbalanced classification, such as resampling, falter in node classification tasks, primarily due to their neglect of graph structure. Worse still, they often exacerbate the model's inclination towards overfitting or underfitting, especially in the absence of adequate prior knowledge. To circumvent these limitations, we introduce a novel Graph Neural Network framework with Curriculum Learning (GNN-CL). This framework integrates two pivotal components. First, leveraging the principles of smoothness and homophily, we endeavor to procure dependable interpolation nodes and edges via adaptive graph oversampling. Second, we combine the Graph Classification Loss with the Metric Learning Loss, thereby refining the spatial proximity of nodes linked to the minority class in the feature space. Drawing inspiration from curriculum learning, the parameters of these components are dynamically modulated during the training phase to accentuate generalization and discrimination capabilities. Comprehensive evaluations on several widely used graph datasets affirm the superiority of our proposed model, which consistently outperforms the existing state-of-the-art methods.
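The abstract's two ingredients can be illustrated with a minimal sketch: SMOTE-style interpolation of minority-class node features (a simplified stand-in for the paper's adaptive, structure-aware oversampling), and a classification loss combined with a metric-learning loss whose weight is phased in by a simple curriculum schedule. This is not the authors' implementation; all names (CurriculumLoss, oversample_minority, lambda_max, warmup_epochs) and the linear schedule are illustrative assumptions.

```python
# Hedged sketch of the two components described in the abstract, in PyTorch.
# NOT the GNN-CL implementation; values and helper names are assumptions.
import torch
import torch.nn as nn


def oversample_minority(x_minority: torch.Tensor, num_new: int) -> torch.Tensor:
    """SMOTE-style interpolation between pairs of minority-class node features.
    The paper's adaptive variant also selects interpolation nodes/edges using
    smoothness and homophily; this shows only the basic interpolation step."""
    idx_a = torch.randint(0, x_minority.size(0), (num_new,))
    idx_b = torch.randint(0, x_minority.size(0), (num_new,))
    alpha = torch.rand(num_new, 1)
    return alpha * x_minority[idx_a] + (1 - alpha) * x_minority[idx_b]


class CurriculumLoss(nn.Module):
    """Node-classification loss plus a metric-learning (triplet) loss, with the
    metric term ramped in over training by a simple linear curriculum."""

    def __init__(self, margin: float = 1.0, lambda_max: float = 0.5,
                 warmup_epochs: int = 50):
        super().__init__()
        self.cls_loss = nn.CrossEntropyLoss()
        self.metric_loss = nn.TripletMarginLoss(margin=margin)
        self.lambda_max = lambda_max
        self.warmup_epochs = warmup_epochs

    def curriculum_weight(self, epoch: int) -> float:
        # Linearly increase the metric-loss weight from 0 to lambda_max.
        return self.lambda_max * min(1.0, epoch / self.warmup_epochs)

    def forward(self, logits, labels, anchor, positive, negative, epoch: int):
        lam = self.curriculum_weight(epoch)
        # Classification term keeps discrimination; metric term pulls
        # minority-class embeddings together and pushes other classes away.
        return self.cls_loss(logits, labels) + lam * self.metric_loss(
            anchor, positive, negative)


if __name__ == "__main__":
    # Usage sketch with random tensors standing in for GNN-encoder outputs:
    # anchor/positive share a (minority) class label, negatives do not.
    loss_fn = CurriculumLoss()
    logits = torch.randn(8, 3, requires_grad=True)   # 8 nodes, 3 classes
    labels = torch.randint(0, 3, (8,))
    emb = torch.randn(8, 16, requires_grad=True)     # node embeddings
    synthetic = oversample_minority(emb[:4].detach(), num_new=2)
    loss = loss_fn(logits, labels, emb[:2], emb[2:4], emb[4:6], epoch=10)
    loss.backward()
    print(float(loss), synthetic.shape)
```

In this sketch the curriculum only scales one loss weight; the paper describes dynamically modulating the parameters of both components during training, which would add similar schedules to the oversampling side as well.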