Computer science
Artificial intelligence
Graph
Mixture model
Transfer learning
Machine learning
Gaussian distribution
Universality (dynamical systems)
Artificial neural network
Theoretical computer science
Pattern recognition (psychology)
Data mining
Physics
Quantum mechanics
Authors
Baiyan Zhang,Hefei Ling,Jialie Shen,Qian Wang,Jie Lei,Yuxuan Shi,Lei Wu,Ping Li
Source
Journal: IEEE Transactions on Cognitive and Developmental Systems
[Institute of Electrical and Electronics Engineers]
Date: 2021-04-23
Volume/Issue: 14 (3): 892-901
Citations: 9
Identifier
DOI:10.1109/tcds.2021.3075280
Abstract
Few-shot learning aims to resolve new tasks heuristically with limited labeled data, and most existing approaches draw on knowledge learned from similar experiences. However, interclass barriers and the insufficiency of new samples limit the transfer of knowledge. In this article, we propose a novel mixture distribution graph network in which the interclass relation is explicitly modeled and propagated via graph generation. By weighting distribution features with a Gaussian mixture model, we take class diversity into consideration and thereby exploit information precisely and efficiently. Equipped with minimal gated units, the "memory" of similar tasks can be preserved and reused through episodic training, which fills a gap in temporal characteristics and softens the impact of data insufficiency. Extensive experiments are carried out on the MiniImageNet and CIFAR-FS data sets. The results show that our method exceeds most state-of-the-art approaches, demonstrating the validity and universality of our method in few-shot learning.
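The abstract names two mechanisms: distribution features weighted by a Gaussian mixture model, and minimal gated units that carry a task "memory" across training episodes. The sketch below is a hypothetical PyTorch illustration of those two building blocks, not the authors' implementation; the names `MGUCell` and `gmm_responsibilities`, the diagonal-covariance mixture, and the one-component-per-class setup are assumptions made for the example.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class MGUCell(nn.Module):
    """Minimal gated unit: a single forget gate instead of the GRU's two gates."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.forget = nn.Linear(input_dim + hidden_dim, hidden_dim)
        self.candidate = nn.Linear(input_dim + hidden_dim, hidden_dim)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        f = torch.sigmoid(self.forget(torch.cat([x, h], dim=-1)))            # forget gate
        h_tilde = torch.tanh(self.candidate(torch.cat([x, f * h], dim=-1)))  # candidate state
        return (1.0 - f) * h + f * h_tilde                                    # updated memory


def gmm_responsibilities(z, mu, log_var, log_pi):
    """Posterior p(component k | z) under a diagonal Gaussian mixture.

    z: [N, D] embeddings; mu, log_var: [K, D] component parameters; log_pi: [K] log weights.
    """
    diff = z.unsqueeze(1) - mu.unsqueeze(0)                                   # [N, K, D]
    log_prob = -0.5 * ((diff ** 2) / log_var.exp().unsqueeze(0)
                       + log_var.unsqueeze(0)
                       + math.log(2 * math.pi)).sum(-1)                       # [N, K]
    return F.softmax(log_prob + log_pi.unsqueeze(0), dim=1)                   # responsibilities


# Toy usage: weight node embeddings by mixture responsibilities (one component
# per class) and fold an episode summary into the MGU memory.
N, D, K, H = 10, 64, 5, 64
z = torch.randn(N, D)                              # node (sample) embeddings
mu = torch.randn(K, D)                             # per-class component means (assumed given)
log_var = torch.zeros(K, D)                        # unit variances
log_pi = torch.log(torch.full((K,), 1.0 / K))      # uniform mixture weights
r = gmm_responsibilities(z, mu, log_var, log_pi)   # [N, K]
weighted = r @ mu                                  # soft class-aware feature per node, [N, D]

cell = MGUCell(input_dim=D, hidden_dim=H)
memory = torch.zeros(1, H)
memory = cell(weighted.mean(dim=0, keepdim=True), memory)   # episode-level memory update
print(r.shape, weighted.shape, memory.shape)
```

In this reading, the mixture responsibilities stand in for the "weighted distribution features": each node embedding is softly assigned to class components before it contributes to graph propagation, while the MGU's single forget gate is what distinguishes it from a GRU and keeps the per-episode memory update lightweight.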