The Nucleotide Transformer: Building and Evaluating Robust Foundation Models for Human Genomics

Keywords: Genomics · Computational biology · Computer science · Transformers · Genome · Biology · Genes · Genetics
Authors
Hugo Dalla-Torre,Liam Gonzalez,Javier Mendoza Revilla,Nicolás López Carranza,Adam Henryk Grywaczewski,Francesco Oteri,Christian Dallago,Evan Trop,Hassan Sirelkhatim,Guillaume Richard,Marcin J. Skwark,Karim Beguir,Marie Lopez,Thomas Pierrot
Identifier
DOI:10.1101/2023.01.11.523679
Abstract

Closing the gap between measurable genetic information and observable traits is a longstanding challenge in genomics. Yet the prediction of molecular phenotypes from DNA sequence alone remains limited and inaccurate, often driven by the scarcity of annotated data and the inability to transfer learnings between prediction tasks. Here, we present an extensive study of foundation models pre-trained on DNA sequences, named the Nucleotide Transformer, ranging from 50M up to 2.5B parameters and integrating information from 3,202 diverse human genomes, as well as 850 genomes selected across diverse phyla, including both model and non-model organisms. These transformer models yield transferable, context-specific representations of nucleotide sequences, which allow for accurate molecular phenotype prediction even in low-data settings. We show that the developed models can be fine-tuned at low cost, even in low-data regimes, to solve a variety of genomics applications. Despite no supervision, the transformer models learned to focus attention on key genomic elements, including those that regulate gene expression, such as enhancers. Lastly, we demonstrate that utilizing model representations can improve the prioritization of functional genetic variants. The training and application of foundational models in genomics explored in this study provide a widely applicable stepping stone to bridge the gap of accurate molecular phenotype prediction from DNA sequence. Code and weights are available at https://github.com/instadeepai/nucleotide-transformer (JAX) and https://huggingface.co/InstaDeepAI (PyTorch). Example notebooks to apply these models to any downstream task are available on HuggingFace.
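The Nucleotide Transformer models operate on DNA tokenized into non-overlapping 6-mers rather than single bases. The following is a minimal illustrative sketch of that tokenization scheme, not the official tokenizer; the released HuggingFace tokenizers define the actual vocabulary and special tokens, and the fallback behavior for leftover bases shown here is an assumption of this sketch.

```python
def kmer_tokenize(seq: str, k: int = 6) -> list[str]:
    """Split a DNA sequence into non-overlapping k-mers.

    Trailing bases that do not fill a whole k-mer are emitted as
    single-nucleotide tokens (assumed fallback for this sketch).
    """
    seq = seq.upper()
    tokens = []
    i = 0
    while i + k <= len(seq):
        tokens.append(seq[i:i + k])  # one full k-mer token
        i += k
    tokens.extend(seq[i:])  # leftover bases, one token each
    return tokens


print(kmer_tokenize("ATGCGTACCA"))  # → ['ATGCGT', 'A', 'C', 'C', 'A']
```

Non-overlapping k-mers keep the tokenized sequence length roughly 6× shorter than the raw base count, which is what lets a fixed-context transformer cover longer genomic windows.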
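The abstract notes that model representations can improve the prioritization of functional variants. One common zero-shot recipe for this (a general technique, not necessarily the paper's exact protocol) scores a variant by the embedding shift between the reference and alternate sequence. The sketch below uses a stand-in `embed` function based on base composition so it is self-contained; a real pipeline would replace it with mean-pooled hidden states from a Nucleotide Transformer checkpoint.

```python
import math


def embed(seq: str) -> list[float]:
    # Stand-in for a real model embedding: simple base-composition
    # features, so the example runs without downloading a checkpoint.
    n = max(len(seq), 1)
    return [seq.count(b) / n for b in "ACGT"]


def cosine(u: list[float], v: list[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0


def variant_score(ref_seq: str, alt_seq: str) -> float:
    """Cosine distance between ref/alt embeddings: a larger shift
    flags the variant as a stronger functional candidate."""
    return 1.0 - cosine(embed(ref_seq), embed(alt_seq))


ref = "ATGCGTACCATTGG"
alt = "ATGCGTACCAGTGG"  # hypothetical single-base substitution
print(variant_score(ref, alt) > variant_score(ref, ref))
```

With a learned embedding, the same scoring loop ranks candidate variants without any task-specific supervision, which is why such representation-based scores are useful in low-data settings.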