3,350 Hits in 2.8 sec

Overcoming catastrophic forgetting in neural networks [article]

James Kirkpatrick, Razvan Pascanu, Neil Rabinowitz, Joel Veness, Guillaume Desjardins, Andrei A. Rusu, Kieran Milan, John Quan, Tiago Ramalho, Agnieszka Grabska-Barwinska, Demis Hassabis, Claudia Clopath, Dharshan Kumaran, Raia Hadsell
2017 arXiv   pre-print
Neural networks are not, in general, capable of this and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models.  ...  We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks which they have not experienced for a long time.  ...  without catastrophic forgetting.  ... 
arXiv:1612.00796v2 fatcat:urg6iqrtqnemzf2mfhvgqwtizq

Overcoming catastrophic forgetting in neural networks

James Kirkpatrick, Razvan Pascanu, Neil Rabinowitz, Joel Veness, Guillaume Desjardins, Andrei A. Rusu, Kieran Milan, John Quan, Tiago Ramalho, Agnieszka Grabska-Barwinska, Demis Hassabis, Claudia Clopath, Dharshan Kumaran, Raia Hadsell
2017 Proceedings of the National Academy of Sciences of the United States of America  
Until now neural networks have not been capable of this and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models.  ...  This phenomenon, termed catastrophic forgetting (2-6), occurs specifically when the network is trained sequentially on multiple tasks because the weights in the network that are important for task A are  ...  catastrophic forgetting.  ... 
doi:10.1073/pnas.1611835114 pmid:28292907 pmcid:PMC5380101 fatcat:ycc27dlo3bbvtei6rdylnjbo6a
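The mechanism this abstract describes, elastic weight consolidation (EWC), slows learning on the weights that matter for task A by adding a quadratic penalty weighted by the diagonal Fisher information. A minimal PyTorch sketch of that penalty, assuming `fisher` and `theta_star` were estimated after training on task A (the names and the default `lam` are illustrative, not the authors' code):

```python
import torch

def ewc_penalty(model, fisher, theta_star, lam=1000.0):
    """Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.

    fisher and theta_star are dicts keyed by parameter name, holding the
    diagonal Fisher estimate and the weight snapshot from task A; lam
    trades retention of task A against plasticity on task B.
    """
    penalty = torch.zeros(())
    for name, param in model.named_parameters():
        penalty = penalty + (fisher[name] * (param - theta_star[name]).pow(2)).sum()
    return 0.5 * lam * penalty

# Training on task B then minimizes: loss_B + ewc_penalty(model, fisher, theta_star)
```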

Natural Way to Overcome the Catastrophic Forgetting in Neural Networks [article]

Alexey Kutalev
2020 arXiv   pre-print
Not so long ago, a method was discovered that successfully overcomes the catastrophic forgetting of neural networks.  ...  In this paper, we would like to propose an alternative method of overcoming catastrophic forgetting based on the total absolute signal passing through each connection in the network.  ...  A method following this logic has recently been proposed to overcome the problem of catastrophic forgetting in NNs [5].  ... 
arXiv:2005.07107v1 fatcat:mlumdd3vhvcjhoymuhxbfzvzje
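The alternative importance measure proposed here replaces EWC's Fisher diagonal with the total absolute signal passing through each connection. One plausible reading for a single linear layer, accumulating |w_ij * a_i| over the training data, is sketched below; the function name and the exact form of the signal are assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn

def accumulate_signal(layer: nn.Linear, batch_inputs: torch.Tensor,
                      signal: torch.Tensor) -> torch.Tensor:
    """Add this batch's absolute signal |w_ij * a_i| to the running total.

    signal has the same shape as layer.weight and plays the role of the
    Fisher diagonal in EWC: connections that carried a large total signal
    on earlier tasks are anchored more strongly on later ones.
    """
    abs_act = batch_inputs.abs().sum(dim=0)                 # shape: (in_features,)
    return signal + layer.weight.detach().abs() * abs_act  # shape: (out_features, in_features)
```

The accumulated totals can then be plugged into the same quadratic anchoring penalty sketched above, in place of the Fisher terms.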

Revisiting Parameter Reuse to Overcome Catastrophic Forgetting in Neural Networks [article]

Yuqing Zhao, Divya Saxena, Jiannong Cao
2022 arXiv   pre-print
Neural networks tend to forget previously learned knowledge when continuously learning on datasets with varying distributions, a phenomenon known as catastrophic forgetting.  ...  In this paper, we propose a new adaptive learning method, named AdaptCL, that fully reuses and grows on learned parameters to overcome catastrophic forgetting and allows the positive backward transfer  ...  However, neural networks are prone to catastrophic forgetting when dealing with non-IID data. Adapting to the new environment will cause the neural network to forget old knowledge.  ... 
arXiv:2207.11005v2 fatcat:wtasnrtmkfhrloif4awtb4l25y

SeNA-CNN: Overcoming Catastrophic Forgetting in Convolutional Neural Networks by Selective Network Augmentation [article]

Abel S. Zacarias, Luís A. Alexandre
2018 arXiv   pre-print
In this paper we present a method to overcome catastrophic forgetting in convolutional neural networks that learns new tasks and preserves performance on old tasks without accessing the data of the  ...  Results also showed that in some situations it is better to use SeNA-CNN instead of training a neural network using isolated learning.  ...  Once again, these results show the ability to overcome the catastrophic forgetting problem in convolutional neural networks by selective network augmentation.  ... 
arXiv:1802.08250v2 fatcat:hdp57pgg7fdoznlgnrf3dkmlay
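Selective network augmentation, as the abstract describes it, preserves old tasks by adding capacity for new ones rather than overwriting shared weights. A generic sketch of that pattern, with a shared trunk and one head per task; SeNA-CNN's actual augmentation inserts convolutional layers, which this simplified version does not reproduce:

```python
import torch.nn as nn

class AugmentedNet(nn.Module):
    """Shared trunk plus one classifier head per task.

    New tasks get freshly added parameters, and previously trained heads
    are never updated again, so old-task performance survives without
    any access to the old data.
    """
    def __init__(self, trunk: nn.Module, feat_dim: int):
        super().__init__()
        self.trunk = trunk
        self.heads = nn.ModuleList()
        self.feat_dim = feat_dim

    def add_task(self, num_classes: int) -> int:
        self.heads.append(nn.Linear(self.feat_dim, num_classes))
        return len(self.heads) - 1  # id of the newly added task

    def forward(self, x, task_id: int):
        return self.heads[task_id](self.trunk(x))
```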

Unsupervised learning to overcome catastrophic forgetting in neural networks

Irene Munoz-Martin, Stefano Bianchi, Giacomo Pedretti, Octavian Melnic, Stefano Ambrogio, Daniele Ielmini
2019 IEEE Journal on Exploratory Solid-State Computational Devices and Circuits  
Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, where training of a new task deletes all previously learned tasks.  ...  INDEX TERMS Catastrophic forgetting, continual learning, convolutional neural network (CNN), neuromorphic engineering, phase-change memory (PCM), spike-timing-dependent plasticity (STDP), supervised learning  ...  CONCLUSION We present a novel neural network, capable of overcoming catastrophic forgetting by combining supervised and unsupervised learning.  ... 
doi:10.1109/jxcdc.2019.2911135 fatcat:jmog3mmwqvg7dp7ilolrvdkrwm

Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay [article]

Fan Zhou, Chengtai Cao
2021 arXiv   pre-print
Towards that, we explore the Continual Graph Learning (CGL) paradigm and present the Experience Replay-based framework ER-GNN for CGL to alleviate the catastrophic forgetting problem in existing GNNs.  ...  Graph Neural Networks (GNNs) have recently received significant research attention due to their superior performance on a variety of graph-related learning tasks.  ...  Catastrophic forgetting is a direct outcome of a more general problem in neural networks, the so-called "stability-plasticity" dilemma [Grossberg, 2012].  ... 
arXiv:2003.09908v2 fatcat:yby3urpyxncdbfmwag4ffbublm
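The experience-replay ingredient of ER-GNN, keeping a small buffer of past examples and mixing them into every new-task batch, can be sketched generically. The buffer below treats examples as opaque objects and uses reservoir sampling to cap memory; the paper's dedicated experience-node selection strategies are not reproduced here:

```python
import random

class ExperienceBuffer:
    """Fixed-capacity replay buffer for continual learning.

    Old examples sampled from the buffer keep receiving gradient signal
    during later tasks, which counteracts catastrophic forgetting.
    """
    def __init__(self, capacity: int = 1000):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0

    def add(self, example) -> None:
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Reservoir sampling: keep a uniform sample of everything seen.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k: int):
        return random.sample(self.buffer, min(k, len(self.buffer)))
```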

Weight Friction: A Simple Method to Overcome Catastrophic Forgetting and Enable Continual Learning [article]

Gabrielle K. Liu
2019 arXiv   pre-print
In this research, we propose a simple method to overcome catastrophic forgetting and enable continual learning in neural networks.  ...  In recent years, deep neural networks have found success in replicating human-level cognitive skills, yet they suffer from several major obstacles.  ...  Conclusions In this research, we addressed the problem of catastrophic forgetting in neural networks, which hinders continual learning.  ... 
arXiv:1908.01052v2 fatcat:sv6wtizqyjbn3nytmurnvqysua

Overcoming Long-term Catastrophic Forgetting through Adversarial Neural Pruning and Synaptic Consolidation [article]

Jian Peng, Bo Tang, Hao Jiang, Zhuo Li, Yinjie Lei, Tao Lin, Haifeng Li
2021 IEEE Transactions on Neural Networks and Learning Systems   accepted
Artificial neural networks face the well-known problem of catastrophic forgetting.  ...  is used to overcome the long-term catastrophic forgetting issue.  ...  CONCLUSIONS AND FUTURE WORKS Long-term catastrophic forgetting limits the application of neural networks in practice.  ... 
doi:10.1109/tnnls.2021.3056201 pmid:33577459 arXiv:1912.09091v2 fatcat:glic2itroraa7jpicjaamjljsu

Attention-Based Structural-Plasticity [article]

Soheil Kolouri, Nicholas Ketz, Xinyun Zou, Jeffrey Krichmar, Praveen Pilly
2019 arXiv   pre-print
Neural networks, in particular, suffer heavily from the catastrophic forgetting phenomenon. Recently there have been several efforts towards overcoming catastrophic forgetting in neural networks.  ...  Here, we propose a biologically inspired method toward overcoming catastrophic forgetting.  ...  Relevant work In order to overcome catastrophic forgetting, three general strategies are reported in the literature: 1. selective synaptic plasticity to protect consolidated knowledge, 2. additional neural  ... 
arXiv:1903.06070v1 fatcat:hhlw2mi44zcijprvuetnvnobr4

Sliced Cramer Synaptic Consolidation for Preserving Deeply Learned Representations

Soheil Kolouri, Nicholas A. Ketz, Andrea Soltoggio, Praveen K. Pilly
2020 International Conference on Learning Representations  
Deep neural networks suffer from the inability to preserve the learned data representation (i.e., catastrophic forgetting) in domains where the input data distribution is non-stationary, and it changes  ...  architectures in both supervised and unsupervised learning settings.  ...  The existing computational models, for instance deep convolutional neural networks (CNNs), face two fundamental issues regarding incremental learning, 1) catastrophic forgetting (McCloskey & Cohen, 1989  ... 
dblp:conf/iclr/KolouriKSP20 fatcat:ku3hiiyenfdr5c52f7wp7sixpq

The stability-plasticity dilemma: investigating the continuum from catastrophic forgetting to age-limited learning effects

Martial Mermillod, Aurélia Bugaiska, Patrick Bonin
2013 Frontiers in Psychology  
The basic idea is that learning in a parallel and distributed system requires plasticity for the integration of new knowledge, but also stability in order to prevent the forgetting of previous knowledge  ...  However, for the most part, neural computation has addressed the problems related to excessive plasticity or excessive stability as two different fields in the literature.  ...  THE PROBLEM OF CATASTROPHIC FORGETTING FOR DISTRIBUTED NEURAL NETWORKS The problem of catastrophic forgetting has emerged as one of the main problems facing artificial neural networks.  ... 
doi:10.3389/fpsyg.2013.00504 pmid:23935590 pmcid:PMC3732997 fatcat:2marhb6nlrhile7vq37ftt2lh4

Overcoming Catastrophic Forgetting by Soft Parameter Pruning [article]

Jian Peng, Jiang Hao, Zhuo Li, Enqiang Guo, Xiaohong Wan, Deng Min, Qing Zhu, Haifeng Li
2018 arXiv   pre-print
Catastrophic forgetting is a challenging issue in continual learning: a deep neural network forgets the knowledge acquired on a former task after learning subsequent tasks.  ...  On the other hand, it also leads to a "long-term" memory issue when network capacity is limited, since adding tasks will "eat" the network capacity.  ...  The dual memory network partially drew on this idea to overcome catastrophic forgetting via an external network.  ... 
arXiv:1812.01640v1 fatcat:sdi2skmqczhidd56uomwfzbfxy

Technical Report for ICCV 2021 Challenge SSLAD-Track3B: Transformers Are Better Continual Learners [article]

Duo Li, Guimei Cao, Yunlu Xu, Zhanzhan Cheng, Yi Niu
2022 arXiv   pre-print
We find that transformers suffer less from catastrophic forgetting compared to convolutional neural networks.  ...  In this report, we first introduce the overall framework of continual learning for object detection. Then, we analyse the key elements' effect on withstanding catastrophic forgetting in our solution.  ...  To overcome catastrophic forgetting in artificial neural networks, three types of methods have been explored in the past few years: replay [15, 3, 6, 20, 10], regularization [9, 19, 8, 4, 17, 13, 5,  ... 
arXiv:2201.04924v1 fatcat:lalu5sa5szgdrewy2ebyjkmagq
Showing results 1 — 15 out of 3,350 results