Interpretability
Annotation
Computer science
Robustness (evolution)
Artificial intelligence
Overfitting
Data type
Encoder
Machine learning
Deep learning
Computational biology
Artificial neural network
Gene
Biology
Genetics
Operating system
Programming language
Authors
Fan Yang, Wenchuan Wang, Fang Wang, Yuan Fang, Duyu Tang, Junzhou Huang, Hui Lü, Jianhua Yao
Identifier
DOI:10.1038/s42256-022-00534-z
Abstract
Annotating cell types on the basis of single-cell RNA-seq data is a prerequisite for research on disease progress and tumour microenvironments. Here we show that existing annotation methods typically suffer from a lack of curated marker gene lists, improper handling of batch effects and difficulty in leveraging the latent gene–gene interaction information, impairing their generalization and robustness. We developed a pretrained deep neural network-based model, single-cell bidirectional encoder representations from transformers (scBERT), to overcome the challenges. Following BERT's approach to pretraining and fine-tuning, scBERT attains a general understanding of gene–gene interactions by being pretrained on huge amounts of unlabelled scRNA-seq data; it is then transferred to the cell type annotation task of unseen and user-specific scRNA-seq data for supervised fine-tuning. Extensive and rigorous benchmark studies validated the superior performance of scBERT on cell type annotation, novel cell type discovery, robustness to batch effects and model interpretability.

Editorial summary: Cell type annotation is a core task for single-cell RNA sequencing, but current bioinformatic tools struggle with some of the underlying challenges, including high dimensionality, data sparsity, batch effects and a lack of labels. In a self-supervised approach, a transformer model called scBERT is pretrained on millions of unlabelled public single-cell RNA-seq data and then fine-tuned with a small number of labelled samples for cell annotation tasks.
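The self-supervised pretraining described in the abstract follows BERT's masked-modelling idea: hide part of each cell's gene-expression vector and train the model to reconstruct the hidden values from the rest. A minimal sketch of that masking step (a hypothetical helper for illustration, not the authors' released code; `mask_rate` and `mask_value` are assumed parameters):

```python
import random

def mask_expression(expr, mask_rate=0.15, mask_value=0.0, seed=0):
    """BERT-style masking for self-supervised pretraining on one cell's
    expression vector: randomly hide a fraction of gene values and keep
    the originals as reconstruction targets (illustrative sketch only)."""
    rng = random.Random(seed)
    masked = list(expr)
    targets = {}  # gene index -> original value the model must predict
    for i in range(len(expr)):
        if rng.random() < mask_rate:
            targets[i] = masked[i]
            masked[i] = mask_value
    return masked, targets

# Example: one cell with six genes; masked positions become 0.0 and
# their true values are stored as reconstruction targets.
cell = [2.1, 0.0, 5.3, 0.7, 1.4, 3.0]
masked_cell, targets = mask_expression(cell, mask_rate=0.3, seed=42)
```

During fine-tuning the same encoder would instead feed a classifier head trained on the small labelled set, which is what transfers the pretrained gene–gene interaction knowledge to cell type annotation.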