Generative grammar
Reinforcement learning
Computer science
Artificial intelligence
Generative model
Machine learning
Chemical space
Transformer
Chemistry
Engineering
Biochemistry
Drug discovery
Electrical engineering
Voltage
Authors
Jike Wang,Chang‐Yu Hsieh,Mingyang Wang,Xiaorui Wang,Zhenhua Wu,Dejun Jiang,Benben Liao,Xujun Zhang,Bo Yang,Qiaojun He,Dongsheng Cao,Xi Chen,Tingjun Hou
Identifier
DOI:10.1038/s42256-021-00403-1
Abstract
Machine learning-based generative models can generate novel molecules with desirable physicochemical and pharmacological properties from scratch. Many excellent generative models have been proposed, but multi-objective optimization in molecular generative tasks remains quite challenging for most existing models. Here we propose the multi-constraint molecular generation (MCMG) approach, which can satisfy multiple constraints by combining a conditional transformer and reinforcement learning algorithms through knowledge distillation. A conditional transformer was used to train a molecular generative model by efficiently learning and incorporating the structure–property relations into a biased generative process. A knowledge distillation model was then employed to reduce the model's complexity so that it can be efficiently fine-tuned by reinforcement learning and enhance the structural diversity of the generated molecules. As demonstrated by a set of comprehensive benchmarks, MCMG is a highly effective approach to traverse large and complex chemical space in search of novel compounds that satisfy multiple property constraints.
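The abstract describes a three-stage pipeline: a biased "teacher" generative model, a distilled "student" that mimics it, and reinforcement-learning fine-tuning against a multi-constraint reward. The toy, pure-Python sketch below illustrates those stages on a categorical distribution over four hypothetical candidate molecules; the candidate set, reward labels, learning rates, and update rules are illustrative stand-ins, not the paper's actual implementation.

```python
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl(p, q):
    # KL divergence between two categorical distributions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Stand-in for the conditional-transformer "teacher": a fixed biased
# distribution over four hypothetical candidate molecules.
teacher = [0.50, 0.30, 0.15, 0.05]

# The distilled "student" starts as a uniform policy over the same candidates.
student_logits = [0.0, 0.0, 0.0, 0.0]

# Stage 1: knowledge distillation — gradient descent on the cross-entropy
# between teacher and student; the gradient w.r.t. logit k is p_k - t_k.
for _ in range(500):
    p = softmax(student_logits)
    student_logits = [z - 0.5 * (pk - tk)
                      for z, pk, tk in zip(student_logits, p, teacher)]
p_distilled = softmax(student_logits)

# Stage 2: RL fine-tuning with a multi-constraint reward: 1.0 only for
# candidates that satisfy every property constraint (hypothetical labels).
reward = [1.0, 0.0, 1.0, 0.0]

def expected_reward(logits):
    p = softmax(logits)
    return sum(pk * rk for pk, rk in zip(p, reward))

er_before = expected_reward(student_logits)
for _ in range(200):
    p = softmax(student_logits)
    er = expected_reward(student_logits)
    # Policy-gradient ascent: dE[r]/dlogit_k = p_k * (r_k - E[r])
    student_logits = [z + 0.5 * pk * (rk - er)
                      for z, pk, rk in zip(student_logits, p, reward)]
er_after = expected_reward(student_logits)
```

After stage 1 the student's distribution closely matches the teacher's; after stage 2 its expected multi-constraint reward has increased relative to the distilled starting point, mirroring the distill-then-fine-tune order described in the abstract.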