Authors
Oufan Zhang,Mojtaba Haghighatlari,Jie Li,Zi Hao Liu,Ashley Namini,João M. C. Teixeira,Julie D. Forman‐Kay,Teresa Head‐Gordon
Abstract
We have developed a Generative Recurrent Neural Network (GRNN) that learns the probability of the next residue's torsions $X_{i+1}=[\phi_{i+1},\psi_{i+1},\omega_{i+1},\chi_{i+1}]$ from the previous residue in the sequence, $X_i$, to generate new IDP conformations. In addition, we couple the GRNN with a Bayesian model, X-EISD, in a reinforcement learning step that biases the torsion probability distributions to take advantage of experimental data types such as J-couplings, NOEs, and PREs. We show that updating the generative model parameters according to reward feedback, based on the agreement between structures and data, improves upon existing approaches that simply reweight static structural pools for disordered proteins. Instead, the GRNN "DynamICE" model learns to physically change the conformations of the underlying pool to those that better agree with experiment.
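The two ideas in the abstract, an autoregressive model of residue-to-residue torsions and a reinforcement-learning step that biases it toward experimental agreement, can be illustrated with a deliberately minimal sketch. Everything here is hypothetical: `TorsionRNN`, `toy_reward`, the Gaussian output head, and all hyperparameters are illustrative stand-ins, not the authors' DynamICE implementation (which models full torsion sets and scores conformers against real J-coupling/NOE/PRE data via X-EISD). The `toy_reward` function plays the role of the X-EISD agreement score.

```python
import numpy as np

rng = np.random.default_rng(0)

class TorsionRNN:
    """Tiny Elman RNN emitting a Gaussian over the next residue's (phi, psi).

    Hypothetical sketch: a real model would use angular (e.g. von Mises)
    outputs and the full torsion set [phi, psi, omega, chi].
    """
    def __init__(self, hidden=16, dim=2, lr=1e-2):
        self.Wx = rng.normal(0, 0.1, (hidden, dim))
        self.Wh = rng.normal(0, 0.1, (hidden, hidden))
        self.Wo = rng.normal(0, 0.1, (dim, hidden))
        self.sigma = 0.5          # fixed policy std-dev (radians)
        self.lr = lr

    def sample_chain(self, length):
        """Autoregressively sample torsions X_{i+1} ~ p(. | X_i)."""
        h = np.zeros(self.Wh.shape[0])
        x = np.zeros(self.Wo.shape[0])     # start token: zero torsions
        xs, hs, mus = [], [], []
        for _ in range(length):
            h = np.tanh(self.Wx @ x + self.Wh @ h)
            mu = self.Wo @ h               # mean of next torsion pair
            x = mu + self.sigma * rng.normal(size=mu.shape)
            xs.append(x); hs.append(h); mus.append(mu)
        return np.array(xs), hs, mus

    def reinforce_update(self, xs, hs, mus, advantage):
        """REINFORCE step: grad of log N(x; mu, sigma) wrt mu is (x-mu)/sigma^2."""
        for x, h, mu in zip(xs, hs, mus):
            g = (x - mu) / self.sigma ** 2
            self.Wo += self.lr * advantage * np.outer(g, h)

def toy_reward(xs, target=np.array([-1.0, 2.4])):
    """Stand-in for the X-EISD agreement score: 1.0 at the target torsions."""
    return float(np.exp(-np.mean((xs - target) ** 2)))

model = TorsionRNN()
baseline = 0.0
for step in range(300):
    xs, hs, mus = model.sample_chain(length=5)
    r = toy_reward(xs)
    model.reinforce_update(xs, hs, mus, r - baseline)  # advantage
    baseline = 0.9 * baseline + 0.1 * r                # running baseline

xs, _, _ = model.sample_chain(length=5)
```

The key design point mirrored from the abstract is that the reward gradient moves the *generator's parameters*, so new conformations are physically re-sampled closer to the data, rather than reweighting a fixed pool of structures.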