Multi-scale cross-attention transformer via graph embeddings for few-shot molecular property prediction

Keywords: Computer science, Embedding, Molecular graph, Graph, Transformer, Theoretical computer science, Machine learning, Artificial intelligence, Feature learning, Data mining
Authors
Luis H.M. Torres, Bernardete Ribeiro, Joel P. Arrais
Source
Journal: Applied Soft Computing [Elsevier]
Volume/Issue: 153, Article 111268 | Cited by: 8
Identifier
DOI: 10.1016/j.asoc.2024.111268
Abstract

Molecular property prediction is a critical step in drug discovery. Deep learning (DL) has accelerated the discovery of compounds with desirable molecular properties for successful drug development. However, molecular property prediction is a low-data problem, which makes it difficult to solve with standard DL approaches. Graph neural networks (GNNs) operate on graph-structured data and use neighborhood aggregation to facilitate the prediction of molecular properties. Nonetheless, GNNs struggle to model the global semantic structure of graph embeddings for molecular property prediction. Recently, Transformer networks have emerged to model such long-range interactions of molecular embeddings at different scales for downstream molecular property prediction tasks. Yet, extending this behavior to molecular embeddings and enabling training on small biological datasets remains an important challenge in drug discovery. In this work, we study how to learn multi-scale representations from comprehensive graph embeddings for molecular property prediction. To this end, we propose a few-shot GNN-Transformer architecture that combines graph embedding tokens of different sizes to produce stronger features for representation learning. A multi-scale Transformer applies a cross-attention mechanism to exchange information between deep representations fused across two separate branches for small and large embeddings. In addition, a two-module meta-learning framework iteratively updates model parameters across tasks to predict new molecular properties from few-shot data. Extensive experiments on multi-property prediction datasets demonstrate the superior performance of the proposed model compared with other standard graph-based methods. The code and data underlying this article are available in the repository: https://github.com/ltorres97/FS-CrossTR.
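
The abstract outlines two architectural ideas: a two-branch Transformer in which "small" and "large" graph-embedding tokens exchange information through cross-attention, and an episodic meta-learning loop for few-shot adaptation. The sketch below illustrates only the first idea in PyTorch; the class names, token shapes, mean-pooling, and prediction head are illustrative assumptions, not the authors' FS-CrossTR implementation (see the linked repository for that).

import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    # One cross-attention exchange: the query branch attends over the context branch.
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.norm_q = nn.LayerNorm(dim)
        self.norm_kv = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, queries, context):
        q = self.norm_q(queries)
        kv = self.norm_kv(context)
        attended, _ = self.attn(q, kv, kv)   # queries from one scale, keys/values from the other
        return queries + attended            # residual connection

class MultiScaleCrossTransformer(nn.Module):
    # Hypothetical two-branch Transformer over "small" and "large" graph-embedding tokens.
    def __init__(self, dim=128, num_heads=4, depth=2):
        super().__init__()
        self.small_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, num_heads, dim * 2, batch_first=True), depth)
        self.large_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, num_heads, dim * 2, batch_first=True), depth)
        self.small_from_large = CrossAttentionBlock(dim, num_heads)
        self.large_from_small = CrossAttentionBlock(dim, num_heads)
        self.head = nn.Linear(2 * dim, 1)    # single-property logit

    def forward(self, small_tokens, large_tokens):
        sm = self.small_encoder(small_tokens)   # (batch, n_small, dim): self-attention per branch
        lg = self.large_encoder(large_tokens)   # (batch, n_large, dim)
        sm = self.small_from_large(sm, lg)      # small branch queries the large branch
        lg = self.large_from_small(lg, sm)      # large branch queries the updated small branch
        fused = torch.cat([sm.mean(dim=1), lg.mean(dim=1)], dim=-1)  # pool and fuse both scales
        return self.head(fused)

# Toy usage: 3 molecules, 16 small-scale tokens and 4 large-scale tokens of width 128.
model = MultiScaleCrossTransformer(dim=128)
small, large = torch.randn(3, 16, 128), torch.randn(3, 4, 128)
print(model(small, large).shape)             # torch.Size([3, 1])

In the paper's setting, the token sequences would come from GNN-derived graph embeddings at two granularities, and the abstract's two-module meta-learning framework would train such a model episodically across property-prediction tasks; that outer loop is omitted from this sketch.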
