Computer science
Artificial intelligence
Machine learning
Multi-task learning
Feature learning
Scalability
Competitive learning
Robustness (evolution)
Deep learning
Task (project management)
Biochemistry
Database
Gene
Economics
Chemistry
Management
Authors
Quang Pham,Chenghao Liu,Steven C. H. Hoi
Identifier
DOI:10.1109/tpami.2023.3324203
Abstract
According to the Complementary Learning Systems (CLS) theory (McClelland et al. 1995) in neuroscience, humans achieve effective continual learning through two complementary systems: a fast learning system centered on the hippocampus for rapid learning of specific, individual experiences; and a slow learning system located in the neocortex for the gradual acquisition of structured knowledge about the environment. Motivated by this theory, we propose DualNets (for Dual Networks), a general continual learning framework comprising a fast learning system for supervised learning of pattern-separated representations from specific tasks and a slow learning system for learning task-agnostic general representations via Self-Supervised Learning (SSL). DualNets can seamlessly incorporate both representation types into a holistic framework to facilitate better continual learning in deep neural networks. Via extensive experiments, we demonstrate the promising results of DualNets on a wide range of continual learning protocols, ranging from the standard offline, task-aware setting to the challenging online, task-free scenario. Notably, on the CTrL (Veniat et al. 2020) benchmark, which has unrelated tasks with vastly different visual images, DualNets achieves competitive performance with existing state-of-the-art dynamic architecture strategies (Ostapenko et al. 2021). Furthermore, we conduct comprehensive ablation studies to validate DualNets' efficacy, robustness, and scalability.
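The abstract describes a two-pathway design: a slow learner trained with SSL to produce task-agnostic features, and a fast learner that builds task-specific, pattern-separated representations on top of them. The following is a minimal NumPy sketch of that forward structure only; all dimensions, weight names, and the frozen slow learner are illustrative assumptions, not the paper's actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative, not from the paper).
d_in, d_slow, d_fast, n_classes = 16, 8, 8, 3

# Slow learner: task-agnostic representation (in the paper, trained via SSL;
# frozen here for the sketch).
W_slow = rng.normal(size=(d_in, d_slow))

# Fast learner: task-specific transformation applied on top of slow features.
W_fast = rng.normal(size=(d_slow, d_fast))

# Classifier head over the fast (pattern-separated) representation.
W_out = rng.normal(size=(d_fast, n_classes))

def forward(x):
    h_slow = np.maximum(x @ W_slow, 0.0)       # general representation
    h_fast = np.maximum(h_slow @ W_fast, 0.0)  # task-specific representation
    return h_fast @ W_out                      # per-task logits

x = rng.normal(size=(4, d_in))
logits = forward(x)
print(logits.shape)  # (4, 3)
```

In the actual framework, the slow learner would be updated continually with a self-supervised objective while the fast learner adapts quickly to each incoming task; this sketch only shows how the two representations compose.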