The Nucleotide Transformer: Building and Evaluating Robust Foundation Models for Human Genomics

Genomics · Computational biology · Computer science · Transformer · Genome · Biology · Gene · Genetics · Engineering
Authors
Hugo Dalla-Torre, Liam Gonzalez, Javier Mendoza Revilla, Nicolás López Carranza, Adam Henryk Grywaczewski, Francesco Oteri, Christian Dallago, Evan Trop, Hassan Sirelkhatim, Guillaume Richard, Marcin J. Skwark, Karim Beguir, Marie Lopez, Thomas Pierrot
Identifier
DOI:10.1101/2023.01.11.523679
Abstract

Closing the gap between measurable genetic information and observable traits is a longstanding challenge in genomics. Yet the prediction of molecular phenotypes from DNA sequences alone remains limited and inaccurate, often driven by the scarcity of annotated data and the inability to transfer learnings between prediction tasks. Here, we present an extensive study of foundation models pre-trained on DNA sequences, named the Nucleotide Transformer, ranging from 50M up to 2.5B parameters and integrating information from 3,202 diverse human genomes, as well as 850 genomes selected across diverse phyla, including both model and non-model organisms. These transformer models yield transferable, context-specific representations of nucleotide sequences, which allow for accurate molecular phenotype prediction even in low-data settings. We show that the developed models can be fine-tuned at low cost, even in low-data regimes, to solve a variety of genomics applications. Despite receiving no supervision, the transformer models learned to focus attention on key genomic elements, including those that regulate gene expression, such as enhancers. Lastly, we demonstrate that utilizing model representations can improve the prioritization of functional genetic variants. The training and application of foundation models in genomics explored in this study provide a widely applicable stepping stone to bridge the gap of accurate molecular phenotype prediction from DNA sequences. Code and weights are available at https://github.com/instadeepai/nucleotide-transformer (Jax) and https://huggingface.co/InstaDeepAI (PyTorch). Example notebooks showing how to apply these models to any downstream task are available on HuggingFace.
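The abstract points readers to HuggingFace notebooks for applying the models downstream. As a minimal illustration of the input side, the sketch below shows non-overlapping k-mer tokenization of a DNA sequence, the style of tokenization these models are commonly described as using (k = 6). The helper name and the handling of leftover bases as single-nucleotide tokens are illustrative assumptions, not details taken from the abstract.

```python
def kmer_tokenize(sequence: str, k: int = 6) -> list[str]:
    """Split a DNA sequence into non-overlapping k-mer tokens.

    Trailing bases that do not fill a complete k-mer are emitted as
    single-nucleotide tokens (an assumption made for this sketch).
    """
    tokens = []
    i = 0
    while i + k <= len(sequence):
        tokens.append(sequence[i:i + k])
        i += k
    # Any leftover bases become one token each.
    tokens.extend(sequence[i:])
    return tokens


# 14 bases -> two 6-mers plus two single-base tokens
print(kmer_tokenize("ATGCATGCATGCAT"))
# ['ATGCAT', 'GCATGC', 'A', 'T']
```

In practice one would map these string tokens to vocabulary ids with the tokenizer shipped alongside the pretrained weights rather than reimplementing the scheme by hand.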
