Overfitting
Benchmark
Transfer learning
Task
Feature
Generalization
Machine learning
Feature extraction
Artificial neural network
Deep learning
Computer science
Pattern recognition
Meta-learning (computer science)
Feature learning
Artificial intelligence
Authors
Yadang Chen, Hui Yan, Zhi-Xin Yang, Enhua Wu
Identifier
DOI:10.1016/j.jvcir.2022.103678
Abstract
Deep neural network models with strong feature extraction capacity are prone to overfitting and fail to adapt quickly to new tasks with few samples. Gradient-based meta-learning approaches can reduce overfitting and adapt to new tasks quickly, but they frequently use shallow neural networks with limited feature extraction capacity. In this paper we present a simple and effective approach called Meta-Transfer-Adjustment learning (MTA), which enables deep neural networks with powerful feature extraction capabilities to be applied to few-shot scenarios while avoiding overfitting, and which gains the capacity to adapt quickly to new tasks via training on numerous tasks. The presented approach comprises two major parts: the Feature Adjustment (FA) module and the Task Adjustment (TA) module. The FA module helps the model make better use of the deep network to improve feature extraction, while the TA module is used to further improve the model's fast adaptation and generalization capabilities. The proposed model delivers good classification results on the few-shot benchmark datasets MiniImageNet and Fewshot-CIFAR100, as demonstrated experimentally.
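The abstract's core mechanism, gradient-based meta-learning with an inner-loop fast adaptation and an outer meta-update, can be illustrated with a minimal first-order MAML-style sketch on toy linear-regression tasks. This is not the paper's implementation: the task generator, the hyperparameters, and the fixed per-feature `scale` vector (a crude stand-in for the Feature Adjustment idea) are all illustrative assumptions, and the Task Adjustment module is not modeled.

```python
import numpy as np

# Hedged sketch of first-order MAML-style meta-learning on toy tasks.
# Names and hyperparameters are assumptions, not the MTA paper's code.

rng = np.random.default_rng(0)
DIM = 3

def make_task():
    """Sample a hypothetical few-shot task: a random linear-regression problem."""
    w = rng.normal(size=DIM)
    X = rng.normal(size=(10, DIM))
    return X, X @ w

def mse_and_grad(theta, scale, X, y):
    """Loss and gradient of the model y_hat = (X * scale) @ theta."""
    Xs = X * scale                      # per-feature scaling (FA-like stand-in)
    err = Xs @ theta - y
    return float(np.mean(err ** 2)), 2.0 * Xs.T @ err / len(y)

theta = np.zeros(DIM)                   # meta-learned initialization
scale = np.ones(DIM)                    # feature-adjustment vector (kept fixed here)
alpha, beta, tasks_per_batch = 0.05, 0.01, 4

for step in range(200):
    meta_grad = np.zeros_like(theta)
    for _ in range(tasks_per_batch):
        X, y = make_task()
        _, g = mse_and_grad(theta, scale, X, y)
        theta_i = theta - alpha * g                  # inner loop: fast adaptation
        _, g_i = mse_and_grad(theta_i, scale, X, y)  # first-order approximation
        meta_grad += g_i
    theta -= beta * meta_grad / tasks_per_batch      # outer loop: meta-update

# One inner gradient step on an unseen task should reduce its loss.
X_new, y_new = make_task()
loss_before, g_new = mse_and_grad(theta, scale, X_new, y_new)
loss_after, _ = mse_and_grad(theta - alpha * g_new, scale, X_new, y_new)
```

The two nested loops mirror the abstract's claim: the outer update trains an initialization across numerous tasks so that the inner update can adapt to a new task in very few gradient steps.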