Keywords
computational biology, systems biology, synthetic biology, inference, annotation, computer science, transformer, generative model, language model, artificial intelligence, biology, machine learning
Authors
Haotian Cui, Xiaoming Wang, Hassaan Maan, Kuan Pang, Fengning Luo, Nan Duan, Bo Wang
Source
Journal: Nature Methods (Nature Portfolio)
Date: 2024-02-26
Volume/Issue: 21(8): 1470-1480
Citations: 227
Identifier
DOI: 10.1038/s41592-024-02201-0
Abstract
Generative pretrained models have achieved remarkable success in various domains such as language and computer vision. Specifically, the combination of large-scale diverse datasets and pretrained transformers has emerged as a promising approach for developing foundation models. Drawing parallels between language and cellular biology (in which texts comprise words; similarly, cells are defined by genes), our study probes the applicability of foundation models to advance cellular biology and genetic research. Using burgeoning single-cell sequencing data, we have constructed a foundation model for single-cell biology, scGPT, based on a generative pretrained transformer across a repository of over 33 million cells. Our findings illustrate that scGPT effectively distills critical biological insights concerning genes and cells. Through further adaptation of transfer learning, scGPT can be optimized to achieve superior performance across diverse downstream applications. This includes tasks such as cell type annotation, multi-batch integration, multi-omic integration, perturbation response prediction and gene network inference. Pretrained using over 33 million single-cell RNA-sequencing profiles, scGPT is a foundation model facilitating a broad spectrum of downstream single-cell analysis tasks by transfer learning.
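The abstract's parallel between texts of words and cells of genes maps naturally onto a masked-language-model setup: represent each cell as a sequence of gene tokens paired with binned expression values, then pretrain a transformer to recover masked values. The sketch below illustrates that general pattern in PyTorch under a BERT-style masked-bin objective; it is not the authors' architecture, and every name and hyperparameter in it (TinyCellTransformer, n_bins, layer sizes) is a placeholder chosen for illustration.

```python
# Minimal sketch of the "cells as sentences" idea: a cell is a set of
# gene tokens, each paired with a binned expression value; a transformer
# encoder is pretrained to predict masked bins. Illustrative only: all
# names and sizes are hypothetical, not taken from the paper.
import torch
import torch.nn as nn


class TinyCellTransformer(nn.Module):
    def __init__(self, n_genes: int, n_bins: int = 51, d_model: int = 64):
        super().__init__()
        self.gene_emb = nn.Embedding(n_genes, d_model)    # gene identity (the "word")
        self.bin_emb = nn.Embedding(n_bins + 1, d_model)  # binned expression; last id = [MASK]
        layer = nn.TransformerEncoderLayer(
            d_model, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_bins)            # predict the masked bin

    def forward(self, gene_ids, expr_bins):
        # gene_ids, expr_bins: (batch, seq_len) integer tensors
        h = self.gene_emb(gene_ids) + self.bin_emb(expr_bins)
        return self.head(self.encoder(h))                 # (batch, seq_len, n_bins)


# Toy pretraining step: mask a fraction of expression bins and predict them.
n_genes, n_bins, batch, seq_len = 1000, 51, 8, 32
model = TinyCellTransformer(n_genes, n_bins)
genes = torch.randint(0, n_genes, (batch, seq_len))
bins = torch.randint(0, n_bins, (batch, seq_len))
mask = torch.rand(batch, seq_len) < 0.15                  # BERT-style 15% masking rate
inputs = bins.masked_fill(mask, n_bins)                   # swap masked bins for the [MASK] id
logits = model(genes, inputs)
loss = nn.functional.cross_entropy(logits[mask], bins[mask])
loss.backward()
```

For the downstream tasks listed in the abstract, the transfer-learning pattern would then replace the bin-prediction head, for example with a cell-type classifier over a pooled cell embedding, and fine-tune on labeled target data.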