Computer science
Machine learning
Overfitting
Leverage (statistics)
Artificial intelligence
Graph
Data mining
Theoretical computer science
Artificial neural network
Authors
Yonghao Liu, Lan Huang, Bowen Cao, Ximing Li, Fausto Giunchiglia, Xiaoyue Feng, Renchu Guan
Identifier
DOI:10.1145/3589334.3645587
Abstract
Graphs, as a fundamental data structure, have proven effective in modeling complex relationships between objects and are therefore found in a wide range of web applications. Graph classification is an essential task in graph data analysis, which can effectively assist in extracting information and mining content from the web. Recently, few-shot graph classification, a more realistic and challenging task, has garnered great research interest. Existing few-shot graph classification models are all supervised, assuming abundant labeled data in base classes for meta-training. However, sufficient annotation is often challenging to obtain in practice due to high costs or the demand for expertise. Moreover, these models commonly adopt complicated meta-learning algorithms via episodic training to transfer prior knowledge from base classes. To break free from these constraints, in this paper we propose a simple yet effective approach named SMART for unsupervised few-shot graph classification that uses no labeled data. SMART employs a transfer learning philosophy instead of the previously prevailing meta-learning paradigm, avoiding the need for sophisticated meta-learning algorithms. Additionally, we adopt a novel mixup strategy to augment the original graph data and leverage unsupervised pretraining on these data to obtain an expressive graph encoder. We also utilize the prompt tuning technique to alleviate the overfitting and low fine-tuning efficiency caused by the limited support samples of novel classes. Extensive experimental results demonstrate the superiority of our proposed approach, which significantly surpasses even leading supervised few-shot graph classification models. Our code is available here.
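The abstract names a mixup-style augmentation of graph data as one ingredient of SMART but does not specify how it operates. As a rough, generic illustration of the mixup idea only (not the paper's actual strategy), the sketch below interpolates two pooled graph-level embeddings with a Beta-distributed coefficient. The function name mixup_graph_embeddings, the embedding dimension 64, and the Beta parameter alpha are all illustrative assumptions.

```python
import numpy as np


def mixup_graph_embeddings(z1, z2, alpha=0.5, rng=None):
    """Generic mixup: convex combination of two graph-level embeddings.

    Illustrative sketch only; SMART's actual graph mixup may act on graph
    structure or node features rather than pooled embeddings.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing coefficient sampled from Beta(alpha, alpha)
    return lam * z1 + (1.0 - lam) * z2, lam


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical pooled embeddings of two graphs (e.g., the readout of a GNN encoder).
    g_a = rng.normal(size=64)
    g_b = rng.normal(size=64)
    mixed, lam = mixup_graph_embeddings(g_a, g_b, alpha=0.5, rng=rng)
    print(f"lambda = {lam:.3f}, mixed embedding norm = {np.linalg.norm(mixed):.3f}")
```

In this reading, the mixed embeddings would serve as additional unlabeled samples for the unsupervised pretraining stage; the choice of alpha controls how strongly the two source graphs are blended.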