Keywords: computer science, modal verb, chemical space, Transformer, transfer learning, encoder, metal–organic framework, grid, deep learning, ranging, artificial intelligence, materials science, adsorption, engineering, electrical engineering, telecommunications, bioinformatics, chemistry, geometry, mathematics, organic chemistry, voltage, polymer chemistry, biology, drug discovery, operating system
Authors
Yeonghun Kang, Hyunsoo Park, Berend Smit, Jihan Kim
Identifier
DOI:10.1038/s42256-023-00628-2
Abstract
Metal–organic frameworks (MOFs) are a class of crystalline porous materials that exhibit a vast chemical space owing to their tunable molecular building blocks with diverse topologies. An unlimited number of MOFs can, in principle, be synthesized. Machine learning approaches can help to explore this vast chemical space by identifying optimal candidates with desired properties from structure–property relationships. Here we introduce MOFTransformer, a multi-modal Transformer encoder pre-trained with 1 million hypothetical MOFs. This multi-modal model uses integrated atom-based graph and energy-grid embeddings to capture the local and global features of MOFs, respectively. By fine-tuning the pre-trained model with small datasets of 5,000 to 20,000 MOFs, our model achieves state-of-the-art results in predicting a variety of properties, including gas adsorption, diffusion, electronic properties, and even text-mined data. Beyond its universal transfer learning capabilities, MOFTransformer generates chemical insights by analyzing feature importance through attention scores within the self-attention layers. As such, this model can serve as a platform for other MOF researchers who seek to develop new machine learning models for their work. Metal–organic frameworks are of high interest for a range of energy and environmental applications due to their stable gas storage properties. A new machine learning approach based on a pre-trained multi-modal Transformer can be fine-tuned with small datasets to predict structure–property relationships and design new metal–organic frameworks for a range of specific tasks.
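The architecture described in the abstract, a Transformer encoder that fuses atom-based graph embeddings (local features) with energy-grid embeddings (global features) and is then fine-tuned with a small task head, can be illustrated with a minimal sketch. This is not the authors' MOFTransformer code; all dimensions, layer sizes, and names below are illustrative assumptions, written in PyTorch:

```python
# Minimal sketch (not the authors' implementation): a multi-modal Transformer
# encoder that concatenates atom-graph tokens and energy-grid patch tokens,
# with a small regression head for fine-tuning on a MOF property such as
# gas uptake. All dimensions are toy values chosen for illustration.
import torch
import torch.nn as nn

class MultiModalMOFEncoder(nn.Module):
    def __init__(self, graph_dim=64, grid_dim=64, d_model=128,
                 n_heads=4, n_layers=2):
        super().__init__()
        # Project each modality into a shared embedding space.
        self.graph_proj = nn.Linear(graph_dim, d_model)  # local (atom-based) features
        self.grid_proj = nn.Linear(grid_dim, d_model)    # global (energy-grid) features
        # A [CLS]-style summary token whose final state feeds the task head.
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Fine-tuning head: swapped out per downstream property.
        self.head = nn.Linear(d_model, 1)

    def forward(self, graph_tokens, grid_tokens):
        # graph_tokens: (B, n_atoms, graph_dim); grid_tokens: (B, n_patches, grid_dim)
        x = torch.cat([
            self.cls.expand(graph_tokens.size(0), -1, -1),
            self.graph_proj(graph_tokens),
            self.grid_proj(grid_tokens),
        ], dim=1)
        # Self-attention mixes local and global tokens in every layer; the
        # attention scores here are what feature-importance analysis reads.
        h = self.encoder(x)
        return self.head(h[:, 0])  # predict the property from the summary token

model = MultiModalMOFEncoder()
graph = torch.randn(2, 10, 64)  # toy atom-graph embeddings for 2 MOFs
grid = torch.randn(2, 16, 64)   # toy energy-grid patch embeddings
pred = model(graph, grid)
print(pred.shape)  # (2, 1): one scalar property per MOF
```

In the pre-train/fine-tune workflow the abstract describes, the encoder weights would be loaded from large-scale pre-training and only briefly updated, together with a fresh `head`, on the small labeled dataset for each target property.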