Computer science
Sequence (biology)
Computational biology
Psychology
Linguistics
Natural language processing
Biology
Genetics
Philosophy
Authors
Richard W. Shuai, Jeffrey A. Ruffolo, Jeffrey J. Gray
Source
Journal: Cell Systems
[Elsevier]
Date: 2023-10-30
Volume/Issue: 14 (11): 979-989.e4
Citations: 31
Identifier
DOI:10.1016/j.cels.2023.10.001
Abstract
Discovery and optimization of monoclonal antibodies for therapeutic applications rely on large sequence libraries but are hindered by developability issues such as low solubility, high aggregation, and high immunogenicity. Generative language models, trained on millions of protein sequences, are a powerful tool for the on-demand generation of realistic, diverse sequences. We present the Immunoglobulin Language Model (IgLM), a deep generative language model for creating synthetic antibody libraries. Compared with prior methods that leverage unidirectional context for sequence generation, IgLM formulates antibody design as text infilling in natural language, allowing it to re-design variable-length spans within antibody sequences using bidirectional context. We trained IgLM on 558 million (M) antibody heavy- and light-chain variable sequences, conditioning on each sequence's chain type and species of origin. We demonstrate that IgLM can generate full-length antibody sequences from a variety of species, and that its infilling formulation allows it to generate infilled complementarity-determining region (CDR) loop libraries with improved in silico developability profiles. A record of this paper's transparent peer review process is included in the supplemental information.
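The abstract describes an infilling formulation: a span (e.g., a CDR loop) is masked out of the sequence, moved to the end of the training example, and generated autoregressively so the model conditions on both the prefix and the suffix. A minimal sketch of such a data format is below; the token names (`[MASK]`, `[SEP]`, `[ANS]`) and the conditioning tags are illustrative assumptions, not IgLM's actual vocabulary.

```python
def make_infilling_example(sequence: str, span_start: int, span_end: int,
                           chain: str = "[HEAVY]", species: str = "[HUMAN]") -> str:
    """Format one antibody sequence as an infilling training example.

    The masked span is moved after a separator token so that a
    left-to-right language model can condition on both the prefix and
    the suffix (bidirectional context) while still generating the span
    autoregressively. Token names here are hypothetical placeholders.
    """
    span = sequence[span_start:span_end]
    masked = sequence[:span_start] + "[MASK]" + sequence[span_end:]
    # Conditioning tags (species, chain type) are prepended, mirroring
    # the paper's description of conditioning on chain type and species.
    return f"{species}{chain}{masked}[SEP]{span}[ANS]"


# Example: mask a 3-residue span of a toy heavy-chain fragment.
example = make_infilling_example("EVQLVESGGG", 3, 6)
print(example)  # → [HUMAN][HEAVY]EVQ[MASK]SGGG[SEP]LVE[ANS]
```

At generation time, the model would be prompted with everything up to `[SEP]` and sample the span until it emits the end-of-answer token, yielding variable-length CDR redesigns.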