Keywords
Computer science, Generative grammar, Inference, Synthetic biology, Transformer, Systems biology, Generative model, Artificial intelligence, Annotation, Computational biology, Biology, Engineering, Electrical engineering, Voltage
Authors
Haotian Cui,Xiaoming Wang,Hassaan Maan,Kuan Pang,Fengning Luo,Bo Wang
Identifier
DOI:10.1101/2023.04.30.538439
Abstract
Generative pre-trained models have achieved remarkable success in various domains such as natural language processing and computer vision. Specifically, the combination of large-scale diverse datasets and pre-trained transformers has emerged as a promising approach for developing foundation models. Drawing parallels between language and cellular biology, where texts comprise words and, similarly, cells are defined by genes, our study probes the applicability of foundation models to advance cellular biology and genetics research. Utilizing the burgeoning single-cell sequencing data, we have pioneered the construction of a foundation model for single-cell biology, scGPT, based on a generative pre-trained transformer trained across a repository of over 33 million cells. Our findings illustrate that scGPT effectively distills critical biological insights concerning genes and cells. Through further adaptation via transfer learning, scGPT can be optimized to achieve superior performance across diverse downstream applications, including cell-type annotation, multi-batch integration, multi-omic integration, genetic perturbation prediction, and gene network inference. The scGPT codebase is publicly available at https://github.com/bowang-lab/scGPT.