DTITR: End-to-end drug–target binding affinity prediction with transformers

Authors
Nelson R. C. Monteiro, José Luís Oliveira, Joel P. Arrais
Source
Journal: Computers in Biology and Medicine [Elsevier]
Volume: 147, Article 105772 · Cited by: 8
Identifier
DOI: 10.1016/j.compbiomed.2022.105772
Abstract

The accurate identification of Drug-Target Interactions (DTIs) remains a critical turning point in drug discovery and understanding of the binding process. Despite recent advances in computational solutions to overcome the challenges of in vitro and in vivo experiments, most of the proposed in silico-based methods still focus on binary classification, overlooking the importance of characterizing DTIs with unbiased binding strength values to properly distinguish primary interactions from those with off-targets. Moreover, several of these methods usually simplify the entire interaction mechanism, neglecting the joint contribution of the individual units of each binding component and the interacting substructures involved, and have yet to focus on more explainable and interpretable architectures. In this study, we propose an end-to-end Transformer-based architecture for predicting drug-target binding affinity (DTA) using 1D raw sequential and structural data to represent the proteins and compounds. This architecture exploits self-attention layers to capture the biological and chemical context of the proteins and compounds, respectively, and cross-attention layers to exchange information and capture the pharmacological context of the DTIs. The results show that the proposed architecture is effective in predicting DTA, achieving superior performance in both correctly predicting the value of interaction strength and being able to correctly discriminate the rank order of binding strength compared to state-of-the-art baselines. The combination of multiple Transformer-Encoders was found to result in robust and discriminative aggregate representations of the proteins and compounds for binding affinity prediction, in which the addition of a Cross-Attention Transformer-Encoder was identified as an important block for improving the discriminative power of these representations. Overall, this research study validates the applicability of an end-to-end Transformer-based architecture in the context of drug discovery, capable of self-providing different levels of potential DTI and prediction understanding due to the nature of the attention blocks. The data and source code used in this study are available at: https://github.com/larngroup/DTITR.
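To make the described architecture concrete, the following is a minimal, hypothetical sketch of the general pattern the abstract outlines: two self-attention Transformer-Encoder stacks (one over protein tokens, one over SMILES tokens) whose outputs are coupled by a cross-attention block and pooled into a regression head that predicts a binding-affinity value. This is not the authors' implementation (their code, available at the GitHub link above, uses different layer configurations); the class names (`DTASketch`, `CrossAttentionBlock`), vocabulary sizes, dimensions, and mean-pooling aggregation are all illustrative assumptions, and PyTorch is used here purely for brevity.

```python
# Hypothetical sketch of a cross-attention DTA model -- NOT the DTITR code.
import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    """Exchanges information between protein and compound token streams."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.prot_to_comp = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.comp_to_prot = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm_p = nn.LayerNorm(d_model)
        self.norm_c = nn.LayerNorm(d_model)

    def forward(self, prot, comp):
        # Protein tokens query compound tokens, and vice versa.
        p_attn, _ = self.prot_to_comp(query=prot, key=comp, value=comp)
        c_attn, _ = self.comp_to_prot(query=comp, key=prot, value=prot)
        return self.norm_p(prot + p_attn), self.norm_c(comp + c_attn)

class DTASketch(nn.Module):
    """Self-attention encoders + cross-attention + affinity regression head."""
    def __init__(self, prot_vocab=26, smiles_vocab=64, d_model=128,
                 n_heads=4, n_layers=2):
        super().__init__()
        self.prot_emb = nn.Embedding(prot_vocab, d_model)    # amino-acid tokens
        self.comp_emb = nn.Embedding(smiles_vocab, d_model)  # SMILES char tokens
        layer = lambda: nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.prot_encoder = nn.TransformerEncoder(layer(), n_layers)
        self.comp_encoder = nn.TransformerEncoder(layer(), n_layers)
        self.cross = CrossAttentionBlock(d_model, n_heads)
        self.head = nn.Sequential(
            nn.Linear(2 * d_model, d_model), nn.ReLU(), nn.Linear(d_model, 1)
        )

    def forward(self, prot_ids, comp_ids):
        prot = self.prot_encoder(self.prot_emb(prot_ids))  # biological context
        comp = self.comp_encoder(self.comp_emb(comp_ids))  # chemical context
        prot, comp = self.cross(prot, comp)                # pharmacological context
        # Mean-pool each stream into an aggregate representation, then regress.
        pooled = torch.cat([prot.mean(dim=1), comp.mean(dim=1)], dim=-1)
        return self.head(pooled).squeeze(-1)

# Example: a batch of 2 protein sequences (length 500) and SMILES strings (length 80).
model = DTASketch()
affinity = model(torch.randint(0, 26, (2, 500)), torch.randint(0, 64, (2, 80)))
print(affinity.shape)  # torch.Size([2])
```

The key design point the abstract emphasizes is visible in `CrossAttentionBlock`: because each stream's queries attend over the other stream's keys and values, the resulting attention weights can be inspected per residue/atom pair, which is what makes this family of architectures more interpretable than models that fuse the two inputs only after pooling.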