scBERT as a large-scale pretrained deep language model for cell type annotation of single-cell RNA-seq data

Interpretability, annotation, computer science, robustness, artificial intelligence, overfitting, data types, encoder, machine learning, deep learning, computational biology, artificial neural networks, genes, biology, genetics, operating systems, programming languages
Authors
Fan Yang, Wenchuan Wang, Fang Wang, Yuan Fang, Duyu Tang, Junzhou Huang, Hui Lü, Jianhua Yao
Source
Journal: Nature Machine Intelligence [Springer Nature]
Volume (issue): 4 (10): 852-866; cited by: 465
Identifier
DOI: 10.1038/s42256-022-00534-z
Abstract

Annotating cell types on the basis of single-cell RNA-seq data is a prerequisite for research on disease progress and tumour microenvironments. Here we show that existing annotation methods typically suffer from a lack of curated marker gene lists, improper handling of batch effects and difficulty in leveraging the latent gene–gene interaction information, impairing their generalization and robustness. We developed a pretrained deep neural network-based model, single-cell bidirectional encoder representations from transformers (scBERT), to overcome the challenges. Following BERT’s approach to pretraining and fine-tuning, scBERT attains a general understanding of gene–gene interactions by being pretrained on huge amounts of unlabelled scRNA-seq data; it is then transferred to the cell type annotation task of unseen and user-specific scRNA-seq data for supervised fine-tuning. Extensive and rigorous benchmark studies validated the superior performance of scBERT on cell type annotation, novel cell type discovery, robustness to batch effects and model interpretability.

Cell type annotation is a core task for single-cell RNA sequencing, but current bioinformatic tools struggle with some of the underlying challenges, including high dimensionality, data sparsity, batch effects and a lack of labels. In a self-supervised approach, a transformer model called scBERT is pretrained on millions of unlabelled public single-cell RNA-seq profiles and then fine-tuned with a small number of labelled samples for cell annotation tasks.
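To make the two-stage scheme described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a BERT-style pipeline for cell typing: a transformer encoder over binned per-gene expression values, a masked-expression reconstruction objective for self-supervised pretraining on unlabelled cells, and a small classification head for supervised fine-tuning. All names and hyperparameters here (CellEncoder, n_bins, d_model, mean pooling) are illustrative assumptions, not the authors' implementation; the published scBERT additionally uses gene2vec-based gene embeddings and a memory-efficient Performer attention variant to scale to the full gene set.

```python
# Hypothetical sketch of a BERT-style pretrain/fine-tune pipeline for cell type
# annotation; NOT the scBERT implementation (which reportedly uses Performer
# attention and gene2vec gene embeddings).
import torch
import torch.nn as nn


class CellEncoder(nn.Module):
    """Transformer encoder over binned per-gene expression values (illustrative)."""

    def __init__(self, n_genes, n_bins, d_model=128, n_layers=2, n_heads=4):
        super().__init__()
        # bins 0..n_bins-1 are discretized expression levels; index n_bins is a [MASK] token
        self.expr_emb = nn.Embedding(n_bins + 1, d_model)
        # learned per-gene identity embedding (stand-in for gene2vec)
        self.gene_emb = nn.Embedding(n_genes, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, expr_bins):  # expr_bins: (batch, n_genes) int64
        gene_ids = torch.arange(expr_bins.size(1), device=expr_bins.device)
        h = self.expr_emb(expr_bins) + self.gene_emb(gene_ids)
        return self.encoder(h)  # (batch, n_genes, d_model)


def pretrain_step(model, recon_head, expr_bins, mask_token, mask_frac=0.15):
    """Self-supervised objective: mask some gene bins, reconstruct them from context."""
    masked = expr_bins.clone()
    mask = torch.rand(expr_bins.shape, device=expr_bins.device) < mask_frac
    masked[mask] = mask_token
    logits = recon_head(model(masked))  # (batch, n_genes, n_bins + 1)
    return nn.functional.cross_entropy(logits[mask], expr_bins[mask])


def finetune_step(model, cls_head, expr_bins, labels):
    """Supervised objective: pool gene representations and classify the cell type."""
    pooled = model(expr_bins).mean(dim=1)  # (batch, d_model)
    return nn.functional.cross_entropy(cls_head(pooled), labels)


if __name__ == "__main__":
    n_genes, n_bins, n_types, d_model = 200, 7, 10, 128  # tiny toy sizes
    model = CellEncoder(n_genes, n_bins, d_model)
    recon_head = nn.Linear(d_model, n_bins + 1)
    cls_head = nn.Linear(d_model, n_types)

    toy_expr = torch.randint(0, n_bins, (4, n_genes))  # 4 fake cells
    toy_labels = torch.randint(0, n_types, (4,))
    print("pretrain loss:", pretrain_step(model, recon_head, toy_expr, mask_token=n_bins).item())
    print("fine-tune loss:", finetune_step(model, cls_head, toy_expr, toy_labels).item())
```

The toy run keeps the gene vocabulary to a few hundred entries so that standard full self-attention fits comfortably in memory; scaling to the tens of thousands of genes in a real expression matrix is precisely why the paper replaces full attention with an efficient approximation.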