Keywords
Computer science
Forgetting
Hierarchy
Disjoint sets
Task (project management)
Class (philosophy)
Artificial intelligence
Class hierarchy
Machine learning
Constraint (computer-aided design)
Simplicity (philosophy)
Programming language
Market economy
Engineering
Economics
Combinatorics
Epistemology
Management
Mathematics
Linguistics
Philosophy
Object-oriented programming
Mechanical engineering
Authors
Byung Hyun Lee, Okchul Jung, Jonghyun Choi, Se Young Chun
Identifiers
DOI: 10.1109/iccv51070.2023.01080
Abstract
Continual learning (CL) enables models to adapt to new tasks and environments without forgetting previously learned knowledge. Current CL setups ignore the relationship between the labels of past and new tasks, whether or not the tasks overlap slightly; real-world scenarios, however, often involve hierarchical relationships between old and new tasks, posing a further challenge for traditional CL approaches. To address this challenge, we propose a novel multi-level hierarchical class-incremental task configuration with an online learning constraint, called hierarchical label expansion (HLE). In our configuration, a network first learns coarse-grained classes, and the data labels then continually expand to more fine-grained classes at various hierarchy depths. To tackle this new setup, we propose a rehearsal-based method that uses hierarchy-aware pseudo-labeling to incorporate hierarchical class information. We also propose a simple yet effective memory-management and sampling strategy that selectively adopts samples of newly encountered classes. Our experiments demonstrate that the proposed method effectively exploits the hierarchy in our HLE setup to improve classification accuracy across all hierarchy levels, regardless of depth and class-imbalance ratio, outperforming prior state-of-the-art work by significant margins while also outperforming it on the conventional disjoint, blurry, and i-Blurry CL setups.
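The abstract gives only a high-level description of hierarchy-aware pseudo-labeling and the memory strategy; the paper's actual algorithms may differ. Below is a minimal Python sketch of one plausible reading: pseudo-labels for coarsely labeled rehearsal samples are constrained to the children of the stored coarse class, and the rehearsal buffer evicts from the most represented class to make room for newly encountered classes. All names here (HIERARCHY, hierarchy_aware_pseudo_labels, ClassBalancedMemory, the threshold tau) are hypothetical illustrations, not taken from the paper.

```python
import random
from collections import defaultdict

import torch
import torch.nn.functional as F

# Hypothetical two-level hierarchy: coarse label -> fine-grained child labels.
HIERARCHY = {
    "vehicle": ["car", "truck", "bicycle"],
    "animal": ["cat", "dog", "bird"],
}


def hierarchy_aware_pseudo_labels(model, images, coarse_labels, class_to_idx, tau=0.7):
    """Assign fine-grained pseudo-labels to coarsely labeled rehearsal samples.

    Each sample's softmax output is restricted to the fine classes that are
    children of its coarse label; the argmax over that restricted set becomes
    the pseudo-label when its probability clears the threshold `tau`.
    """
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(images), dim=1)
    pseudo = []
    for p, coarse in zip(probs, coarse_labels):
        children = HIERARCHY[coarse]
        child_probs = p[[class_to_idx[c] for c in children]]
        best = int(child_probs.argmax())
        # Keep the pseudo-label only when the restricted prediction is confident.
        pseudo.append(children[best] if child_probs[best] >= tau else None)
    return pseudo


class ClassBalancedMemory:
    """Fixed-size rehearsal buffer that makes room for newly encountered classes
    by evicting a random sample from the currently most represented class."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []                # list of (image, label) pairs
        self.counts = defaultdict(int)  # per-class sample counts

    def add(self, image, label):
        if len(self.buffer) < self.capacity:
            self.buffer.append((image, label))
            self.counts[label] += 1
            return
        # Buffer is full: evict from the most represented class so that
        # samples of new classes can be adopted without starving old ones.
        largest = max(self.counts, key=self.counts.get)
        victim = random.choice(
            [i for i, (_, lab) in enumerate(self.buffer) if lab == largest]
        )
        self.counts[largest] -= 1
        self.buffer[victim] = (image, label)
        self.counts[label] += 1
```

Restricting the softmax to the children of a sample's coarse label is what makes the pseudo-labeling hierarchy-aware: the known coarse label rules out fine classes that would be inconsistent with it, so the model only chooses among plausible refinements.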