Encoding
Imputation (statistics)
Computer Science
Transformer
Data Mining
Leverage (statistics)
Spatial Analysis
Inference
Artificial Intelligence
Missing Data
Machine Learning
Biology
Mathematics
Gene
Statistics
Engineering
Electrical Engineering
Voltage
Biochemistry
Authors
Wen, Hongzhi; Tang, Wenzhuo; Jin, Wei; Ding, Jiayuan; Liu, Renming; Shi, Feng; Xie, Yuying; Tang, Jiliang
Source
Journal: Cornell University - arXiv
Date: 2023-02-05
Identifier
DOI: 10.48550/arxiv.2302.03038
Abstract
Spatially resolved transcriptomics brings exciting breakthroughs to single-cell analysis by providing physical locations along with gene expression. However, as the cost of extremely high spatial resolution, cellular-level spatial transcriptomic data suffer significantly from missing values. While a standard solution is to perform imputation on the missing values, most existing methods either overlook spatial information or only incorporate localized spatial context without the ability to capture long-range spatial information. Using multi-head self-attention mechanisms and positional encoding, transformer models can readily grasp the relationship between tokens and encode location information. In this paper, by treating single cells as spatial tokens, we study how to leverage transformers to facilitate spatial transcriptomics imputation. In particular, we investigate the following two key questions: (1) $\textit{how to encode spatial information of cells in transformers}$, and (2) $\textit{how to train a transformer for transcriptomic imputation}$. By answering these two questions, we present a transformer-based imputation framework, SpaFormer, for cellular-level spatial transcriptomic data. Extensive experiments demonstrate that SpaFormer outperforms existing state-of-the-art imputation algorithms on three large-scale datasets.
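To make the cells-as-tokens idea in the abstract concrete, below is a minimal PyTorch sketch of the general recipe it describes: embed each cell's expression profile as a token, inject its 2D spatial coordinates via a sinusoidal positional encoding, and train a transformer encoder to reconstruct masked expression values. This is not the authors' SpaFormer implementation; all module names, dimensions, and the masking scheme are hypothetical illustrations.

```python
# Illustrative sketch only (not the SpaFormer code): cells as spatial tokens,
# 2D sinusoidal positional encoding, masked reconstruction for imputation.
import torch
import torch.nn as nn


def sinusoidal_2d_encoding(coords: torch.Tensor, dim: int) -> torch.Tensor:
    """Encode (x, y) coordinates of shape (n_cells, 2) into a (n_cells, dim) embedding.

    Half of the channels encode x, the other half y, each with the standard
    sine/cosine scheme used for transformer positional encodings.
    """
    assert dim % 4 == 0, "dim must be divisible by 4 for a 2D sin/cos encoding"
    half = dim // 2
    freqs = torch.exp(
        torch.arange(0, half, 2, dtype=torch.float32)
        * (-torch.log(torch.tensor(10000.0)) / half)
    )
    parts = []
    for axis in range(2):  # x first, then y
        angles = coords[:, axis : axis + 1] * freqs  # (n_cells, half/2)
        parts.append(torch.sin(angles))
        parts.append(torch.cos(angles))
    return torch.cat(parts, dim=-1)  # (n_cells, dim)


class CellTransformerImputer(nn.Module):
    """Transformer over cells-as-tokens that reconstructs (masked) expression profiles."""

    def __init__(self, n_genes: int, d_model: int = 128, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(n_genes, d_model)   # project expression to token embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.decode = nn.Linear(d_model, n_genes)  # reconstruct expression per cell

    def forward(self, expr: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
        # expr: (batch, n_cells, n_genes); coords: (batch, n_cells, 2)
        pos = sinusoidal_2d_encoding(coords.reshape(-1, 2), self.embed.out_features)
        pos = pos.reshape(expr.shape[0], expr.shape[1], -1)
        tokens = self.embed(expr) + pos             # add spatial positional encoding
        return self.decode(self.encoder(tokens))


if __name__ == "__main__":
    # Toy usage: randomly mask entries and compute a masked reconstruction loss.
    batch, n_cells, n_genes = 2, 64, 200
    expr = torch.rand(batch, n_cells, n_genes)
    coords = torch.rand(batch, n_cells, 2) * 100    # spatial (x, y) in arbitrary units
    mask = torch.rand_like(expr) < 0.3              # simulate missing values
    model = CellTransformerImputer(n_genes)
    pred = model(expr * (~mask), coords)
    loss = ((pred - expr)[mask] ** 2).mean()        # loss only on masked entries
    loss.backward()
    print(f"masked reconstruction MSE: {loss.item():.4f}")
```

The sketch uses a simple additive sinusoidal encoding of continuous coordinates; the paper studies how best to encode cell positions and how to train for imputation, so its actual design choices may differ.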