Keywords
Linker
Chemistry
Peptide
Molecular dynamics
Acylation
Molecular mechanics
Bioinformatics
Transmembrane domain
Hydrophobic effect
Stereochemistry
Biophysics
Amino acid
Biochemistry
Computational chemistry
Biology
Gene
Operating system
Catalysis
Computer science
Authors
Tine Maja Frimann, Suk Kyu Ko, Pernille Harris, Jens Thostrup Bukrinski, Günther H.J. Peters
Identifier
DOI: 10.1080/07391102.2022.2078409
Abstract
We have performed a series of molecular dynamics (MD) simulations of glucagon-like peptide-1 (GLP-1) and acylated GLP-1 analogues in complex with the endogenous receptor (GLP-1R) to obtain a molecular understanding of how fatty acid (FA) chain structure, acylation position on the peptide, and the presence of a linker affect binding. The MD simulations were analysed to extract heatmaps of receptor-peptide interaction patterns and to determine the free energy of binding using the molecular mechanics Poisson-Boltzmann surface area (MM-PBSA) approach. The free energies extracted from the MM-PBSA calculations are in qualitative agreement with experimentally determined potencies. Furthermore, the interaction patterns seen in the receptor-GLP-1 complex simulations resemble previously reported binding interactions, validating the simulations. Analysing the receptor-GLP-1 analogue complex simulations, we found that the major differences between the systems stem from FA interactions and the positioning of the acylation site on the peptide. Hydrophobic interactions between the FA chain and a hydrophobic patch on the extracellular domain contribute significantly to the binding affinity. Acylation on Lys26 resulted in noticeably more interactions between the FA chain and the extracellular domain hydrophobic patch than acylation on Lys34 or Lys38. The presence of a charged linker between the peptide and the FA chain can potentially stabilise the complex by forming hydrogen bonds to arginine residues in the linker region between the extracellular domain and the transmembrane domain. A molecular understanding of the fatty acid structure and its effect on binding provides important insights into designing acylated agonists for GLP-1R.
Communicated by Ramaswamy H. Sarma.
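For context, the MM-PBSA binding free energy referenced in the abstract follows the standard decomposition sketched below. This is the textbook form of the method, not an equation reproduced from the paper, and the entropy term is frequently omitted in practice:

\[
\Delta G_{\text{bind}} = \langle G_{\text{complex}} \rangle - \langle G_{\text{receptor}} \rangle - \langle G_{\text{peptide}} \rangle
\]
\[
G = E_{\text{MM}} + G_{\text{PB}} + G_{\text{SA}} - TS, \qquad
E_{\text{MM}} = E_{\text{int}} + E_{\text{elec}} + E_{\text{vdW}}, \qquad
G_{\text{SA}} \approx \gamma \cdot \mathrm{SASA} + b
\]

Here the angle brackets denote averages over MD snapshots, \(E_{\text{MM}}\) is the gas-phase molecular mechanics energy, \(G_{\text{PB}}\) is the polar solvation term obtained by solving the Poisson-Boltzmann equation, and \(G_{\text{SA}}\) is the nonpolar solvation term estimated from the solvent-accessible surface area (SASA) with empirical coefficients \(\gamma\) and \(b\).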