Concepts: Forgetting, Computer science, Artificial intelligence, Benchmark (surveying), Autoencoder, Task (project management), Machine learning, Multi-task learning, Deep learning, Convolutional neural network, Set (abstract data type), Engineering, Philosophy, Linguistics, Programming language, Geodesy, Systems engineering, Geography
Author
Reem Mahmoud, Hazem Hajj
Source
Journal: ACM Transactions on Knowledge Discovery from Data
[Association for Computing Machinery]
Date: 2022-07-30
Volume/Issue: 16(6): 1-20
Citations: 2
Abstract
One key objective of artificial intelligence involves the continuous adaptation of machine learning models to new tasks. This branch of continual learning is also referred to as lifelong learning (LL), where a major challenge is to minimize catastrophic forgetting, or forgetting previously learned tasks. While previous work on catastrophic forgetting has focused on vision problems, this work targets time-series data. In addition to choosing an architecture appropriate for time-series sequences, our work addresses limitations in previous work, including the handling of distribution shifts in class labels. We present multi-objective learning with three loss functions to simultaneously minimize catastrophic forgetting, prediction error, and errors in generalizing across label shifts. We build a multi-task autoencoder network with a hierarchical convolutional recurrent architecture. The proposed method is capable of learning multiple time-series tasks simultaneously. For cases where the model needs to learn multiple new tasks, we propose sequential learning, starting with the tasks that have the best individual performances. This solution was evaluated on four benchmark human activity recognition datasets collected from mobile sensing devices. A wide set of baseline comparisons is performed, and an ablation analysis is run to evaluate the impact of the different losses in the proposed multi-objective method. The results demonstrate an improvement of up to 4% in catastrophic forgetting compared to the loss functions used in state-of-the-art solutions, while showing minimal losses compared to the upper-bound methods of traditional fine-tuning (FT) and multi-task learning (MTL).
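The abstract describes the method only at a high level. The sketch below is a hypothetical PyTorch illustration of the general idea, not the authors' implementation: the class name `HierConvRecAE`, the layer sizes, the lambda weights, the distillation-style forgetting term, and the use of a reconstruction term as a stand-in for the label-shift objective are all assumptions, since the exact loss formulations are not given here.

```python
# Hypothetical sketch only; all names and loss choices are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierConvRecAE(nn.Module):
    """Toy multi-task autoencoder: a Conv1d feature extractor feeding a GRU
    encoder, a GRU decoder that reconstructs the input window, and one
    classification head per task (a stand-in for the paper's hierarchical
    convolutional recurrent architecture)."""

    def __init__(self, in_channels, hidden, num_classes_per_task):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.recon = nn.Linear(hidden, in_channels)
        self.heads = nn.ModuleList([nn.Linear(hidden, c) for c in num_classes_per_task])

    def forward(self, x, task_id):
        # x: (batch, time, channels) sensor window
        h = F.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)  # (batch, time, hidden)
        enc_seq, last = self.encoder(h)
        dec_seq, _ = self.decoder(enc_seq)
        x_hat = self.recon(dec_seq)                    # reconstruction of the window
        logits = self.heads[task_id](last.squeeze(0))  # task-specific prediction
        return x_hat, logits


def multi_objective_loss(model, old_model, x, y, task_id,
                         lam_pred=1.0, lam_forget=1.0, lam_recon=0.1):
    """Weighted sum of three terms standing in for the abstract's objectives:
    prediction error on the new task, a distillation-style penalty against
    forgetting a previously learned task (head 0 here), and a reconstruction
    term used as an assumed proxy for generalization across label shifts."""
    x_hat, logits = model(x, task_id)
    loss_pred = F.cross_entropy(logits, y)   # prediction error on the new task
    loss_recon = F.mse_loss(x_hat, x)        # assumed proxy regularizer
    with torch.no_grad():
        _, old_logits = old_model(x, 0)      # frozen outputs of the previous model
    _, cur_logits_old_head = model(x, 0)
    loss_forget = F.kl_div(F.log_softmax(cur_logits_old_head, dim=1),
                           F.softmax(old_logits, dim=1), reduction="batchmean")
    return lam_pred * loss_pred + lam_forget * loss_forget + lam_recon * loss_recon


# Example with random data: two tasks, 6 sensor channels, 128-step windows.
model = HierConvRecAE(in_channels=6, hidden=64, num_classes_per_task=[5, 4])
old_model = HierConvRecAE(in_channels=6, hidden=64, num_classes_per_task=[5, 4])
old_model.load_state_dict(model.state_dict())   # frozen snapshot of the prior model
x, y = torch.randn(8, 128, 6), torch.randint(0, 4, (8,))
loss = multi_objective_loss(model, old_model, x, y, task_id=1)
loss.backward()
```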