Graph
Computer science
Graph kernel
Mathematics
Theoretical computer science
Artificial intelligence
Kernel method
Polynomial kernel
Support vector machine
Authors
Lixiang Xu, Haifeng Liu, Xin Yuan, Enhong Chen, Yuan Yan Tang
Source
Journal: IEEE Transactions on Cybernetics
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Issue: 1-13
Identifier
DOI: 10.1109/tcyb.2024.3465213
Abstract
While highly influential in deep learning, especially in natural language processing, the Transformer model has not exhibited competitive performance in unsupervised graph representation learning (UGRL). Conventional approaches, which focus on local substructures of the graph, offer simplicity but often fail to capture the graph's comprehensive structural information, which leads to suboptimal generalization. To address this, we propose GraKerformer, a variant of the standard Transformer architecture, to mitigate this shortfall in structural representation and improve performance on UGRL. By leveraging the shortest-path graph kernel (SPGK) to weight attention scores and combining it with graph neural networks, GraKerformer effectively encodes the nuanced structural information of graphs. We conducted evaluations on benchmark datasets for graph classification to validate the superior performance of our approach.
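For intuition, here is a minimal Python sketch of how a shortest-path kernel matrix might be used to weight self-attention scores, as the abstract describes. The function and class names, the exponential-decay form of the kernel, and the elementwise multiplication of the score matrix are illustrative assumptions, not the authors' GraKerformer implementation.

```python
# Hypothetical sketch: the paper states only that SPGK values weight the
# attention scores; the exact form used here (multiplicative, per node
# pair, exp(-gamma * d)) is an assumption for illustration.
import math

import networkx as nx
import torch
import torch.nn as nn


def shortest_path_kernel_bias(g: nx.Graph, gamma: float = 0.5) -> torch.Tensor:
    """Node-pair weights k(i, j) = exp(-gamma * d(i, j)), where d is the
    shortest-path length. Unreachable pairs keep weight 0."""
    g = nx.convert_node_labels_to_integers(g)  # ensure 0..n-1 indexing
    n = g.number_of_nodes()
    bias = torch.zeros(n, n)
    for i, lengths in nx.all_pairs_shortest_path_length(g):
        for j, d in lengths.items():
            bias[i, j] = math.exp(-gamma * d)
    return bias


class KernelWeightedAttention(nn.Module):
    """Single-head self-attention whose score matrix is multiplied
    elementwise by a precomputed shortest-path kernel matrix."""

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, kernel: torch.Tensor) -> torch.Tensor:
        # x: (n_nodes, dim); kernel: (n_nodes, n_nodes)
        scores = (self.q(x) @ self.k(x).transpose(-2, -1)) * self.scale
        attn = (scores * kernel).softmax(dim=-1)  # kernel reweights scores
        return attn @ self.v(x)


# Usage on a toy graph: a 6-cycle with 8-dimensional random node features.
g = nx.cycle_graph(6)
bias = shortest_path_kernel_bias(g)
layer = KernelWeightedAttention(dim=8)
out = layer(torch.randn(6, 8), bias)
print(out.shape)  # torch.Size([6, 8])
```

Under this reading, nearby node pairs (small shortest-path distance) receive larger kernel values and thus contribute more to attention, injecting global path structure that plain dot-product attention lacks.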