Computer science
Machine learning
Artificial intelligence
Transformer
Property (philosophy)
Artificial neural network
Graph
Theoretical computer science
Quantum mechanics
Epistemology
Physics
Philosophy
Voltage
Authors
Jian Gao, Zheyuan Shen, Yufeng Xie, Jialiang Lu, Yang Lu, Sikang Chen, Qingyu Bian, Yue Guo, Liteng Shen, Jian Wu, Binbin Zhou, Tingjun Hou, Qiaojun He, Jinxin Che, Xiaowu Dong
Abstract
Predicting the biological properties of molecules is crucial in computer-aided drug development, yet it is often impeded by data scarcity and imbalance in many practical applications. Existing approaches rely on self-supervised learning or 3D data and use an increasing number of parameters to improve performance. These approaches may not take full advantage of established chemical knowledge and can inadvertently introduce noise into the model. In this study, we introduce a more elegant transformer-based framework with focused attention for molecular representation (TransFoxMol) to improve artificial intelligence (AI) understanding of molecular structure–property relationships. TransFoxMol incorporates a multi-scale 2D molecular environment into a graph neural network + Transformer module and uses prior chemical maps to obtain a more focused attention landscape than existing approaches. Experimental results show that TransFoxMol achieves state-of-the-art performance on MoleculeNet benchmarks and surpasses baselines that use self-supervised learning or geometry-enhanced strategies on small-scale datasets. Subsequent analyses indicate that TransFoxMol's predictions are highly interpretable, and the judicious use of chemical knowledge enables AI to perceive molecules in a simple but rational way, enhancing performance.
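The idea of a "focused attention landscape" guided by prior chemical maps can be illustrated with a minimal sketch (not the authors' code): plain dot-product self-attention over atom features, masked by a hypothetical bond-adjacency matrix so each atom attends only to chemically related atoms. All names and the toy molecule here are illustrative assumptions.

```python
# Hedged sketch: attention over atom features restricted ("focused") by a
# prior chemical map, here a toy bond-adjacency matrix with self-loops.
import numpy as np

def focused_attention(x, prior_mask):
    """x: (n_atoms, d) atom features; prior_mask: (n_atoms, n_atoms) 0/1 map."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)                    # plain dot-product scores
    scores = np.where(prior_mask > 0, scores, -1e9)  # focus via chemical prior
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)    # row-wise softmax
    return weights @ x                               # attended atom features

# Toy 4-atom chain: each atom attends to itself and its bonded neighbours.
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]], dtype=float)
x = np.random.default_rng(0).normal(size=(4, 8))
out = focused_attention(x, adj)
print(out.shape)  # (4, 8)
```

With an identity mask each atom attends only to itself and the input is returned unchanged, which is one way to sanity-check that the prior actually restricts the attention landscape.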