iTAML: An Incremental Task-Agnostic Meta-learning Approach

Jathushan Rajasegaran, Salman Khan, Munawar Hayat, Fahad Shahbaz Khan, Mubarak Shah
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Humans can continuously learn new knowledge as their experience grows. In contrast, knowledge previously learned by deep neural networks quickly fades out when they are trained on a new task. In this paper, we hypothesize that this problem can be avoided by learning a set of generalized parameters that are specific to neither old nor new tasks. In this pursuit, we introduce a novel meta-learning approach that seeks to maintain an equilibrium between all the encountered tasks. This is ensured by a new meta-update rule which avoids catastrophic forgetting. In comparison to previous meta-learning techniques, our approach is task-agnostic. When presented with a continuum of data, our model automatically identifies the task and quickly adapts to it with just a single update. We perform extensive experiments on five datasets in a class-incremental setting, leading to significant improvements over state-of-the-art methods (e.g., a 21.3% boost on CIFAR100 with 10 incremental tasks). Specifically, on large-scale datasets that generally prove difficult for incremental learning, our approach delivers absolute gains as high as 19.1% and 7.4% on the ImageNet and MS-Celeb datasets, respectively. Our code is available at: https://github.com/brjathu/iTAML
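For intuition, the following is a minimal sketch of the kind of task-balancing meta-update the abstract describes: adapt a copy of the model to each task, then pull the shared parameters toward the average of the adapted ones (a Reptile-style outer step). All names here (meta_update, tasks, inner_lr, meta_lr, inner_steps) are illustrative assumptions, not the authors' API; the exact iTAML update rule is given in the paper and the linked repository.

```python
# Minimal sketch (assumed names, Reptile-style outer step; not the exact
# iTAML rule) of a meta-update that balances all encountered tasks.
import copy
import torch
import torch.nn as nn

def meta_update(model, tasks, inner_lr=0.01, meta_lr=0.1, inner_steps=1):
    """Adapt a copy of `model` to each task, then move the shared
    parameters toward the average of the task-adapted parameters."""
    base = {n: p.detach().clone() for n, p in model.named_parameters()}
    avg = {n: torch.zeros_like(p) for n, p in base.items()}
    loss_fn = nn.CrossEntropyLoss()

    for x, y in tasks:
        task_model = copy.deepcopy(model)  # inner loop runs on a copy
        opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):       # quick adaptation per task
            opt.zero_grad()
            loss_fn(task_model(x), y).backward()
            opt.step()
        with torch.no_grad():
            for n, p in task_model.named_parameters():
                avg[n] += p / len(tasks)   # average the adapted weights

    # Outer step: pull the shared weights toward the task-average, so they
    # stay generalized rather than tied to the most recently seen task.
    with torch.no_grad():
        for n, p in model.named_parameters():
            p.copy_(base[n] + meta_lr * (avg[n] - base[n]))

if __name__ == "__main__":
    # Toy continuum: three "tasks" of random (input, label) batches.
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    tasks = [(torch.randn(16, 32), torch.randint(0, 10, (16,)))
             for _ in range(3)]
    meta_update(model, tasks)
```

Because the outer step averages over every task's adapted weights rather than fine-tuning on the newest task alone, no single task can dominate the shared parameters, which is the equilibrium property the abstract refers to.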
doi:10.1109/cvpr42600.2020.01360 dblp:conf/cvpr/RajasegaranKHKS20