Graph
Curvature
Computer Science
Psychology
Mathematics
Theoretical Computer Science
Geometry
Authors
Yili Chen, Wan Zheng, Yangyang Li, Xiao He, Xian Wei, Jun Han
Identifier
DOI:10.1021/acs.jcim.4c01616
Abstract
Graph neural networks (GNNs) have revolutionized drug discovery in chemistry and biology, enhancing efficiency and reducing resource demands. However, classical GNNs often struggle to capture long-range dependencies due to challenges like oversmoothing and oversquashing. Graph Transformers address these issues by employing global self-attention mechanisms that allow direct information exchange between any pair of nodes, enabling the modeling of long-range interactions. Despite this, Graph Transformers often face difficulties in capturing the nuanced structural information on graphs. To overcome these challenges, we introduce the CurvFlow-Transformer, a novel graph Transformer model incorporating a curvature flow-based masked attention mechanism. By leveraging a topologically enhanced mask matrix, the attention layer can effectively detect subtle structural differences within graphs, balancing the focus between global mutual information and local structural details of molecules. The CurvFlow-Transformer demonstrates superior performance on the MoleculeNet data set, surpassing several state-of-the-art models across various tasks. Moreover, the model provides unique insights into the relationship between molecular structure and chemical properties by analyzing the attention heat coefficients of individual atoms.
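The masked attention mechanism described above can be sketched as standard scaled dot-product attention with an additive bias matrix derived from a graph-curvature measure. The sketch below is a minimal illustration, not the paper's implementation: the function name, the toy mask values, and the way the curvature bias enters the scores are all assumptions for demonstration.

```python
import numpy as np

def curvature_masked_attention(Q, K, V, curvature_mask):
    """Scaled dot-product attention with an additive structural bias.

    curvature_mask: (n, n) array of per-node-pair bias terms, assumed to be
    derived from a graph-curvature measure (hypothetical; the paper's exact
    mask construction is not reproduced here).
    """
    d = Q.shape[-1]
    # Add the structural bias to the raw attention scores before softmax,
    # so pairs with favorable curvature receive more attention weight.
    scores = Q @ K.T / np.sqrt(d) + curvature_mask
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy 3-node molecular graph with 4-dimensional node features.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
# Hypothetical symmetric curvature bias: positive entries pull attention
# toward structurally "close" pairs, negative entries push it away.
mask = np.array([[0.0, 1.0, -1.0],
                 [1.0, 0.0,  0.5],
                 [-1.0, 0.5, 0.0]])
out, w = curvature_masked_attention(Q, K, V, mask)
```

Each row of `w` remains a valid probability distribution over nodes, so the bias reweights attention toward structurally relevant neighbors without breaking the global all-pairs information exchange that distinguishes Graph Transformers from message-passing GNNs.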