Inverse
Generative model
Graph
Algorithm
Scalar (mathematics)
Computer science
Generative grammar
Artificial intelligence
Mathematics
Theoretical computer science
Geometry
Authors
Bowen Du et al.
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-07-08
Volume/Issue: 35 (9): 11857-11871
Cited by: 4
Identifier
DOI: 10.1109/tnnls.2024.3416328
Abstract
Designing new molecules is essential for drug discovery and material science. Recently, deep generative models that aim to model the molecule distribution have made promising progress in narrowing down the chemical search space and generating high-fidelity molecules. However, current generative models focus only on modeling either 2-D bonding graphs or 3-D geometries, which are two complementary descriptors of molecules. The inability to jointly model them limits generation quality and further downstream applications. In this article, we propose a joint 2-D and 3-D graph diffusion model (JODO) that generates geometric graphs representing complete molecules with atom types, formal charges, bond information, and 3-D coordinates. To capture the correlation between 2-D molecular graphs and 3-D geometries in the diffusion process, we develop a diffusion graph transformer (DGT) to parameterize the data prediction model that recovers the original data from noisy data. The DGT uses a relational attention mechanism that enhances the interaction between node and edge representations. This mechanism operates concurrently with the propagation and update of scalar attributes and geometric vectors. Our model can also be extended for inverse molecular design targeting single or multiple quantum properties. In our comprehensive evaluation pipeline for unconditional joint generation, the experimental results show that JODO remarkably outperforms the baselines on the QM9 and GEOM-Drugs datasets. Furthermore, our model excels in few-step fast sampling, as well as in inverse molecule design and molecular graph generation. Our code is available at https://github.com/GRAPH-0/JODO.
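The abstract describes a relational attention mechanism in which node and edge representations interact during each update. The sketch below illustrates one plausible form of that idea: edge features bias the pairwise attention scores between nodes, and the resulting attention weights in turn update the edge features. This is a minimal numpy illustration under our own assumptions, not the authors' DGT implementation; all function and variable names (`relational_attention`, `We`, etc.) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relational_attention(H, E, Wq, Wk, Wv, We):
    """One hypothetical relational-attention step.

    H: (n, d) node features; E: (n, n, d) edge features.
    Edge features bias the node-to-node attention scores, and the
    attention weights then feed back into the edge representations,
    so nodes and edges are updated jointly.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv              # (n, d) projections
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # (n, n) pairwise scores
    scores = scores + (E @ We).squeeze(-1)        # edges bias the scores
    A = softmax(scores, axis=-1)                  # rows sum to 1
    H_new = A @ V                                 # nodes aggregate neighbor values
    E_new = E + A[..., None] * V[None, :, :]      # edges absorb weighted values
    return H_new, E_new, A

# Toy usage on random features.
n, d = 4, 8
rng = np.random.default_rng(0)
H = rng.normal(size=(n, d))
E = rng.normal(size=(n, n, d))
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
Wv = rng.normal(size=(d, d))
We = rng.normal(size=(d, 1))
H_new, E_new, A = relational_attention(H, E, Wq, Wk, Wv, We)
```

In the actual DGT this exchange runs alongside the propagation of scalar attributes and geometric vectors, which the toy above omits.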