The Nucleotide Transformer: Building and Evaluating Robust Foundation Models for Human Genomics

Genomics · Computational Biology · Computer Science · Transformer · Genome · Biology · Gene · Genetics · Engineering · Electrical Engineering · Voltage
Authors
Hugo Dalla-Torre, Liam Gonzalez, Javier Mendoza Revilla, Nicolás López Carranza, Adam Henryk Grywaczewski, Francesco Oteri, Christian Dallago, Evan Trop, Hassan Sirelkhatim, Guillaume Richard, Marcin J. Skwark, Karim Beguir, Marie Lopez, Thomas Pierrot
Identifier
DOI:10.1101/2023.01.11.523679
Abstract

Closing the gap between measurable genetic information and observable traits is a longstanding challenge in genomics. Yet, the prediction of molecular phenotypes from DNA sequences alone remains limited and inaccurate, often driven by the scarcity of annotated data and the inability to transfer learnings between prediction tasks. Here, we present an extensive study of foundation models pre-trained on DNA sequences, named the Nucleotide Transformer, ranging from 50M up to 2.5B parameters and integrating information from 3,202 diverse human genomes, as well as 850 genomes selected across diverse phyla, including both model and non-model organisms. These transformer models yield transferable, context-specific representations of nucleotide sequences, which allow for accurate molecular phenotype prediction even in low-data settings. We show that the developed models can be fine-tuned at low cost and despite low available data regime to solve a variety of genomics applications. Despite no supervision, the transformer models learned to focus attention on key genomic elements, including those that regulate gene expression, such as enhancers. Lastly, we demonstrate that utilizing model representations can improve the prioritization of functional genetic variants. The training and application of foundational models in genomics explored in this study provide a widely applicable stepping stone to bridge the gap of accurate molecular phenotype prediction from DNA sequence. Code and weights available at: https://github.com/instadeepai/nucleotide-transformer in Jax and https://huggingface.co/InstaDeepAI in PyTorch. Example notebooks to apply these models to any downstream task are available on HuggingFace.
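As a minimal illustration of how DNA sequences are typically prepared for such transformer models, the sketch below tokenizes a sequence into non-overlapping 6-mers over the A/C/G/T alphabet, falling back to single-character tokens for ambiguous bases (e.g. `N`) or short trailing chunks. This is an assumption-laden sketch of the general 6-mer scheme, not the exact tokenizer shipped with the Nucleotide Transformer; the function name and fallback rule are illustrative.

```python
def tokenize_6mers(seq: str) -> list[str]:
    """Split a DNA sequence into non-overlapping 6-mer tokens.

    Chunks containing characters outside A/C/G/T (e.g. 'N'), and any
    trailing chunk shorter than 6 bases, fall back to single-character
    tokens. Illustrative sketch only, not the released NT tokenizer.
    """
    tokens = []
    seq = seq.upper()
    i = 0
    while i < len(seq):
        chunk = seq[i:i + 6]
        if len(chunk) == 6 and set(chunk) <= set("ACGT"):
            tokens.append(chunk)  # clean 6-mer token
            i += 6
        else:
            tokens.append(chunk[0])  # fallback: single-nucleotide token
            i += 1
    return tokens

print(tokenize_6mers("ATGCGTNACGTAC"))  # → ['ATGCGT', 'N', 'ACGTAC']
```

The released models and tokenizers themselves can be loaded from the InstaDeepAI organization on the Hugging Face Hub, as noted in the abstract.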