Continual learning with attentive recurrent neural networks for temporal data classification

Keywords: Forgetting, Computer Science, Artificial Intelligence, Task (Project Management), Recurrent Neural Network, Machine Learning, Artificial Neural Network, Deep Learning, Cognitive Psychology, Psychology, Economics, Management
Authors
Shao-Yu Yin,Yu Huang,Tien-Yu Chang,Shih-Fang Chang,Vincent S. Tseng
Source
Journal: Neural Networks [Elsevier BV]
Volume/Issue: 158: 171-187 · Cited by: 11
Identifier
DOI:10.1016/j.neunet.2022.10.031
Abstract

Continual learning is an emerging branch of deep learning that aims to learn a model over a series of tasks without forgetting the knowledge obtained from previous tasks. Despite receiving much attention in the research community, continual learning techniques for temporal data remain underexplored. In this paper, we address temporal-based continual learning, in which a model learns continuously from temporal data. To overcome the catastrophic forgetting that arises when learning temporal data in task-incremental scenarios, we propose a novel method based on attentive recurrent neural networks, called Temporal Teacher Distillation (TTD). TTD addresses catastrophic forgetting in an attentive recurrent neural network based on three hypotheses: the Rotation Hypothesis, the Redundant Hypothesis, and the Recover Hypothesis. The Rotation and Redundant Hypotheses can cause the attention-shift phenomenon, which degrades model performance on previously learned tasks, while ignoring the Recover Hypothesis incurs extra memory usage when training on successive tasks. The proposed TTD, built on these hypotheses, therefore complements the shortcomings of existing methods for temporal-based continual learning. To evaluate the proposed method in the task-incremental setting, we use a public dataset, WIreless Sensor Data Mining (WISDM), and a synthetic dataset, Split-QuickDraw-100. Experimental results show that TTD significantly outperforms state-of-the-art methods by up to 14.6% in accuracy and 45.1% in forgetting measures. To the best of our knowledge, this is the first work to study continual learning for temporal data classification with attentive recurrent neural networks under realistic incremental-category settings and to provide a proper application-oriented scenario.
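The abstract does not spell out TTD's training objective, but teacher-distillation approaches to continual learning typically combine a cross-entropy loss on the current task with a distillation term that keeps the student's softened predictions close to those of a frozen teacher trained on earlier tasks. A minimal NumPy sketch of such a combined loss follows; the function names and the `alpha`/`T` weighting are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled, numerically stable softmax over the last axis.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Cross-entropy on the new task plus a KL term pulling the student
    toward the frozen teacher's (previous-task) soft predictions."""
    p_teacher = softmax(teacher_logits, T)   # soft targets from the teacher
    p_student = softmax(student_logits, T)   # student's softened predictions
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    probs = softmax(student_logits)
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    # T**2 rescales the KL gradient magnitude, as in standard distillation.
    return np.mean(alpha * ce + (1 - alpha) * (T ** 2) * kl)
```

When the student matches the teacher and classifies the current batch correctly, the loss is near zero; drifting away from the teacher's predictions on old tasks inflates the KL term, which is what discourages forgetting.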