Towards Human Brain Inspired Lifelong Learning, pp. 9-23 (2024)

Chapter 2: Architectural Approaches to Continual Learning

Haytham M. Fayek and Hong Ren Wu
School of Computing Technologies, Royal Melbourne Institute of Technology (RMIT) University, Melbourne, Australia

https://doi.org/10.1142/9789811286711_0002

Abstract: Continual learning is the ability of a learning system to solve new tasks by utilizing knowledge previously acquired from learning and performing prior tasks, without significantly degrading that prior knowledge. Numerous approaches have been developed to achieve this ability while avoiding the well-known problem of catastrophic forgetting in neural networks. This chapter reviews a number of architectural approaches to continual learning in neural networks, which tackle the problem by modifying the architecture of the network, for example, by adding new adaptive parameters to the model for each new task. The architectural paradigm for continual learning can potentially eliminate catastrophic forgetting entirely while maintaining competitive performance, often at the expense of increased computational complexity.
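The per-task parameter idea mentioned in the abstract can be illustrated with a minimal sketch: a shared feature extractor plus one freshly allocated output head per task. All names and the two-layer structure below are illustrative assumptions for exposition, not the specific methods the chapter reviews.

```python
import numpy as np

class MultiHeadNet:
    """Shared feature extractor with one output head per task.

    Allocating a fresh head for each new task leaves the parameters of
    earlier heads untouched, so predictions for prior tasks cannot be
    corrupted by later training -- a toy version of the architectural
    paradigm for continual learning (illustrative sketch only).
    """

    def __init__(self, in_dim, hidden_dim, seed=0):
        self.rng = np.random.default_rng(seed)
        # Parameters shared across all tasks.
        self.shared_w = self.rng.standard_normal((in_dim, hidden_dim)) * 0.1
        # One (hidden_dim, out_dim) weight matrix per task.
        self.heads = []

    def add_task(self, out_dim):
        """Add new adaptive parameters for a new task; return its id."""
        hidden_dim = self.shared_w.shape[1]
        self.heads.append(self.rng.standard_normal((hidden_dim, out_dim)) * 0.1)
        return len(self.heads) - 1

    def forward(self, x, task_id):
        h = np.maximum(x @ self.shared_w, 0.0)  # shared ReLU features
        return h @ self.heads[task_id]          # task-specific head

net = MultiHeadNet(in_dim=4, hidden_dim=8)
t0 = net.add_task(out_dim=3)
head0_before = net.heads[t0].copy()
t1 = net.add_task(out_dim=5)  # new task: new parameters are allocated
# The earlier task's parameters are untouched by the expansion.
assert np.array_equal(net.heads[t0], head0_before)
```

The trade-off noted in the abstract is visible even in this toy: every new task grows the parameter count, which is the computational cost paid for isolating prior knowledge.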