Transformer
Chemistry
Biochemical engineering
Computer science
Engineering
Engineering physics
Nanotechnology
Materials science
Electrical engineering
Voltage
Authors
Rui Wang, Yujin Ji, Youyong Li, Shuit‐Tong Lee
Identifier
DOI: 10.1021/acs.jpclett.4c03128
Abstract
The powerful data processing and pattern recognition capabilities of machine learning (ML) technology have provided technical support for innovation in computational chemistry. Compared with traditional ML and deep learning (DL) techniques, transformers possess fine-grained feature-capturing abilities: they can efficiently and accurately model dependencies in long-sequence data, simulate complex and diverse chemical spaces, and explore the computational logic behind the data. In this Perspective, we provide an overview of the application of transformer models in computational chemistry. We first introduce the working principle of transformer models and analyze transformer-based architectures in computational chemistry. Next, we explore practical applications of the model in a number of specific scenarios, such as property prediction and chemical structure generation. Finally, based on these applications and research results, we provide an outlook on future research in this field.
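The abstract's claim that transformers "model dependencies in long-sequence data" rests on the scaled dot-product self-attention mechanism, in which every token attends to every other token in the sequence. The following minimal NumPy sketch (an illustration, not code from the paper; the toy shapes and the SMILES framing are assumptions for concreteness) shows that mechanism:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise token-token similarities
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output token: weighted mix of all value vectors

# Toy "sequence" of 4 tokens with 8-dimensional embeddings, e.g. the
# characters of a SMILES string after an embedding layer (hypothetical).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)  # (4, 8): one updated representation per input token
```

Because the attention weights connect every position to every other in a single step, dependencies between distant tokens (say, two ring-closure symbols far apart in a SMILES string) are captured without the step-by-step propagation a recurrent model would need.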