Computer science
Graph
Timeline
Hop (telecommunications)
Attention network
Artificial intelligence
Machine learning
Theoretical computer science
Computer network
Mathematics
Statistics
Authors
Chaojie Ji, Ruxin Wang, Rongxiang Zhu, Yunpeng Cai, Hongyan Wu
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 5
Identifiers
DOI: 10.48550/arxiv.2004.04333
Abstract
Due to the cost of labeling nodes, classifying a node in a sparsely labeled graph while maintaining prediction accuracy deserves attention. The key point is how the algorithm learns sufficient information from more neighbors at different hop distances. This study first proposes a hop-aware attention supervision mechanism for the node classification task. A simulated annealing learning strategy is then adopted to balance the two learning tasks, node classification and hop-aware attention coefficient supervision, along the training timeline. Compared with state-of-the-art models, the experimental results demonstrate the superior effectiveness of the proposed Hop-aware Supervision Graph Attention Networks (HopGAT) model. In particular, for the protein-protein interaction network, on a graph with only 40% of nodes labeled, the performance loss is only 3.9%, from 98.5% to 94.6%, compared to the fully labeled graph. Extensive experiments also demonstrate the effectiveness of the supervised attention coefficients and the learning strategy.
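The abstract describes two coupled objectives: a node-classification loss and a hop-aware attention supervision loss, balanced by a simulated-annealing strategy along the training timeline. The sketch below illustrates one plausible form of such an annealed two-term objective. The function names, the exponential cooling schedule, and all constants are assumptions chosen for illustration; they are not the paper's actual formulation.

```python
import math


def attention_weight(epoch: int, t0: float = 1.0, alpha: float = 0.95) -> float:
    """Hypothetical simulated-annealing-style cooling schedule.

    The 'temperature' t0 * alpha**epoch is used directly as the weight on the
    hop-aware attention supervision loss, so that term dominates early in
    training and fades as training proceeds. The paper's schedule may differ.
    """
    return t0 * alpha ** epoch


def combined_loss(cls_loss: float, attn_loss: float, epoch: int) -> float:
    """Sketch of the two-task objective: node-classification loss plus an
    annealed weight on the attention-coefficient supervision loss."""
    lam = attention_weight(epoch)
    return cls_loss + lam * attn_loss


# Example: the attention supervision term is emphasized early and fades later.
for epoch in (0, 50, 100):
    print(epoch, round(attention_weight(epoch), 4))
```

As a design note, annealing the auxiliary weight toward zero lets the attention supervision guide the attention coefficients early, while the final epochs optimize almost purely for classification accuracy.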