Keywords
Domain adaptation, Artificial intelligence, Computer science, Feature vector, Pattern recognition, Machine learning, Classifier
Authors
Rashindrie Perera, Saman Halgamuge
Source
Journal: Cornell University - arXiv
Date: 2024-03-07
Citations: 1
Identifier
DOI: 10.48550/arxiv.2403.04492
Abstract
In this paper, we look at cross-domain few-shot classification which presents the challenging task of learning new classes in unseen domains with few labelled examples. Existing methods, though somewhat effective, encounter several limitations, which we address in this work through two significant improvements. First, to address overfitting associated with fine-tuning a large number of parameters on small datasets, we introduce a lightweight parameter-efficient adaptation strategy. This strategy employs a linear transformation of pre-trained features, significantly reducing the trainable parameter count. Second, we replace the traditional nearest centroid classifier with a variance-aware loss function, enhancing the model's sensitivity to the inter- and intra-class variances within the training set for improved clustering in feature space. Empirical evaluations on the Meta-Dataset benchmark show that our approach not only improves accuracy by up to 7.7% and 5.3% on seen and unseen datasets respectively, but also achieves this performance while being at least ~3x more parameter-efficient than existing methods, establishing a new state-of-the-art in cross-domain few-shot learning. Our code can be found at https://github.com/rashindrie/DIPA.
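The two ideas in the abstract can be illustrated with a short sketch: a linear adapter applied to frozen pre-trained features (so only a d×d weight and a bias are trainable), and a loss that penalises intra-class spread relative to inter-class spread. This is a minimal illustration of the general techniques, not the paper's DIPA implementation; the names `LinearAdapter` and `variance_aware_loss` and the exact loss form (intra/inter variance ratio) are assumptions for demonstration.

```python
import torch
import torch.nn as nn


class LinearAdapter(nn.Module):
    """Adapts frozen pre-trained features with a single linear map.

    Trainable parameters: d*d weights + d biases, far fewer than
    fine-tuning the full backbone.
    """

    def __init__(self, dim: int):
        super().__init__()
        # Identity initialisation: adaptation starts from the
        # unmodified pre-trained features.
        self.weight = nn.Parameter(torch.eye(dim))
        self.bias = nn.Parameter(torch.zeros(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.weight + self.bias


def variance_aware_loss(feats: torch.Tensor, labels: torch.Tensor,
                        eps: float = 1e-6) -> torch.Tensor:
    """Encourage tight clusters (low intra-class variance) that sit far
    apart (high inter-class variance) in feature space."""
    classes = labels.unique()
    centroids = torch.stack([feats[labels == c].mean(0) for c in classes])
    # Intra-class term: mean squared distance to the class centroid.
    intra = torch.stack([
        ((feats[labels == c] - centroids[i]) ** 2).sum(1).mean()
        for i, c in enumerate(classes)
    ]).mean()
    # Inter-class term: spread of centroids around the global centroid.
    inter = ((centroids - centroids.mean(0)) ** 2).sum(1).mean()
    return intra / (inter + eps)
```

During episodic adaptation one would freeze the backbone, extract features for the support set, and optimise only the adapter's parameters against this loss, which drives same-class features together and different-class centroids apart.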