Mitigation of catastrophic forgetting in recurrent neural networks using a Fixed Expansion Layer

Robert Coop, Itamar Arel
The 2013 International Joint Conference on Neural Networks (IJCNN), 2013
Catastrophic forgetting (or catastrophic interference) in supervised learning systems is the drastic loss of previously stored information caused by the learning of new information. While substantial work has been published on addressing catastrophic forgetting in memoryless supervised learning systems (e.g., feedforward neural networks), the problem has received limited attention in the context of dynamic systems, particularly recurrent neural networks. In this paper, we introduce a solution for mitigating catastrophic forgetting in RNNs based on enhancing the Fixed Expansion Layer (FEL) neural network, which exploits sparse coding of hidden neuron activations. Simulation results on several non-stationary data sets clearly demonstrate the effectiveness of the proposed architecture.
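The abstract describes the Fixed Expansion Layer only at a high level: a fixed (untrained) projection into a larger space combined with sparse coding of the resulting activations, so that different inputs activate largely disjoint sets of units and new learning overwrites less of the old. A minimal sketch of that idea, using a frozen random weight matrix and a k-winners-take-all sparsity rule — the dimensions, the k-WTA choice, and the tanh nonlinearity are illustrative assumptions, not the authors' exact design:

```python
import numpy as np

rng = np.random.default_rng(0)

def fixed_expansion_layer(x, W, k):
    """Project x through frozen weights W into a larger space, then keep
    only the k largest-magnitude activations (sparse code); zero the rest."""
    h = np.tanh(W @ x)                    # expanded activations
    losers = np.argsort(np.abs(h))[:-k]   # indices of all but the top-k
    h[losers] = 0.0                       # enforce sparsity
    return h

# Example: expand a 10-dimensional input to 100 units, keep 5 active.
W = rng.standard_normal((100, 10))        # fixed: never updated by training
x = rng.standard_normal(10)
code = fixed_expansion_layer(x, W, k=5)
print(np.count_nonzero(code))             # 5 active units out of 100
```

Because the sparse codes of dissimilar inputs overlap in few units, gradient updates for new patterns touch few of the weights that encode old ones, which is the mechanism by which this family of architectures mitigates interference.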
doi:10.1109/ijcnn.2013.6707047 dblp:conf/ijcnn/CoopA13