Mnemonics Training: Multi-Class Incremental Learning without Forgetting
[article]
2020
arXiv
pre-print
Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. ...
However, there is an inherent trade-off to effectively learning new concepts without catastrophic forgetting of previous ones. ...
Conclusions: In this paper, we develop a novel mnemonics training framework for tackling multi-class incremental learning tasks. ...
arXiv:2002.10211v4
fatcat:zz4vaugqxreo5hl7phs46qhu5i
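The abstract above turns on the idea of making the stored exemplars themselves trainable. The following minimal PyTorch sketch shows only that core idea under toy assumptions (a small frozen linear model, random 32x32 images, a placeholder class label); it does not reproduce the authors' actual bi-level mnemonics optimization.

import torch
import torch.nn.functional as F

# Hypothetical setup: a frozen model from the previous phase and a handful of
# stored images for one old class (all shapes and values are placeholders).
old_model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
for p in old_model.parameters():
    p.requires_grad_(False)
stored_images = torch.rand(20, 3, 32, 32)
old_class = 3

# Core idea: treat the exemplars as learnable parameters and optimize the pixels
# themselves so the previous-phase model keeps classifying them correctly.
exemplars = torch.nn.Parameter(stored_images.clone())
optimizer = torch.optim.SGD([exemplars], lr=0.01)
targets = torch.full((len(stored_images),), old_class, dtype=torch.long)

for _ in range(100):  # illustrative number of refinement steps
    optimizer.zero_grad()
    loss = F.cross_entropy(old_model(exemplars), targets)
    loss.backward()
    optimizer.step()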
Mnemonical Body Shortcuts: Gestural Interface for Mobile Devices
[article]
2014
arXiv
pre-print
We present a body-space-based approach to improve mobile device interaction and performance, which we have named Mnemonical Body Shortcuts. ...
Preliminary studies using Radio Frequency Identification (RFID) technology were performed, validating Mnemonical Body Shortcuts as an appropriate new mobile interaction mechanism. ...
Default Mnemonical Body Shortcuts: As stated in the fourth chapter, we also made available 12 default Mnemonical Body Shortcuts, which can be used without any training. ...
arXiv:1402.1296v1
fatcat:vqqgbaumqvhlljs4vj6tgp6lzi
A Class-Incremental Learning Method Based on Preserving the Learned Feature Space for EEG-Based Emotion Recognition
2022
Mathematics
In this paper, we propose a Class-Incremental Learning (CIL) method, named Incremental Learning preserving the Learned Feature Space (IL2FS), in order to enable deep learning models to incorporate new classes. ...
Forgetting (LwF) [49], Incremental Classifier and Representation Learning (iCARL) [32], Mnemonics [53], ScaIL [51], Weighting Aligning (WA) [36], and Geodesic+LUCIR [25]. ...
doi:10.3390/math10040598
fatcat:ykcoaupcpjcnljo7tsjtqeypvy
DILF-EN framework for Class-Incremental Learning
[article]
2021
arXiv
pre-print
Deep learning models suffer from catastrophic forgetting of the classes in the older phases as they get trained on the classes introduced in the new phase in the class-incremental learning setting. ...
Therefore, we also propose a novel dual-incremental learning framework that involves jointly training the network with two incremental learning objectives, i.e., the class-incremental learning objective ...
As evident from its training process, our proposed DILF combines multi-task learning, self-supervision, and data augmentation for training the model. ...
arXiv:2112.12385v1
fatcat:zzyud7ac65hvlk3tv6fyw4tnte
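The DILF-EN abstract above describes jointly training one network with two objectives together with self-supervision and augmentation. The heavily simplified sketch below shows what such joint training can look like; the two heads, the rotation-prediction auxiliary task, and the 0.5 loss weight are illustrative assumptions, not the paper's design.

import torch
import torch.nn.functional as F

# Shared backbone with two heads optimized jointly: a class-incremental
# classification head and an auxiliary self-supervised head (toy sizes).
backbone = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 128), torch.nn.ReLU())
cls_head = torch.nn.Linear(128, 100)   # all classes seen so far
aux_head = torch.nn.Linear(128, 4)     # e.g. predict which of 4 rotations was applied
optimizer = torch.optim.SGD(
    list(backbone.parameters()) + list(cls_head.parameters()) + list(aux_head.parameters()), lr=0.1)

images = torch.rand(8, 3, 32, 32)         # placeholder batch
labels = torch.randint(0, 100, (8,))
rotations = torch.randint(0, 4, (8,))     # placeholder self-supervision targets

features = backbone(images)
loss = F.cross_entropy(cls_head(features), labels) \
     + 0.5 * F.cross_entropy(aux_head(features), rotations)  # joint objective
optimizer.zero_grad()
loss.backward()
optimizer.step()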
SAR Target Incremental Recognition Based on Hybrid Loss Function and Class-Bias Correction
2022
Applied Sciences
There are still three problems in the existing incremental learning methods: (1) the recognition performance of old target classes degrades significantly during the incremental process; (2) the target ...
Incremental learning has emerged as a way to continuously obtain new knowledge from new data while preserving most previously learned knowledge, saving both time and storage. ...
Old Sample Preservation: Multi-class incremental learning without forgetting (mnemonics training) [33] is a method based on old sample preservation, and it is similar to [8, 12, 28, 29]. ...
doi:10.3390/app12031279
fatcat:vcdcob5w3vg7raa6yvviabwp5i
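The "Old Sample Preservation" passage above boils down to replay: keep a few samples of the old classes and mix them into each new phase. A minimal sketch, assuming toy tensor datasets and arbitrary class splits:

import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

# A small buffer of old-class samples is concatenated with the new phase's data,
# so every batch can contain both old and new classes (all shapes are toy values).
old_images, old_labels = torch.rand(40, 1, 64, 64), torch.randint(0, 4, (40,))
new_images, new_labels = torch.rand(200, 1, 64, 64), torch.randint(4, 7, (200,))

memory_buffer = TensorDataset(old_images, old_labels)   # preserved old samples
current_phase = TensorDataset(new_images, new_labels)

loader = DataLoader(ConcatDataset([current_phase, memory_buffer]), batch_size=32, shuffle=True)
for images, labels in loader:
    pass  # the incremental model's forward/backward pass would go here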
Essentials for Class Incremental Learning
[article]
2021
arXiv
pre-print
In this work, we shed light on the causes of this well-known yet unsolved phenomenon - often referred to as catastrophic forgetting - in a class-incremental setup. ...
Moreover, we identify poor quality of the learned representation as another reason for catastrophic forgetting in class-IL. ...
Mnemonics training: Multi-class incremental learning without forgetting. In IEEE Con... [29] Max Welling. Herding dynamical weights to learn. ...
arXiv:2102.09517v1
fatcat:li3tvwjwanfbnn52dx6lhi42im
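The entry above cites Welling's herding, which exemplar-based methods such as iCaRL commonly use to pick the samples whose running mean best tracks the class mean. A small NumPy sketch of that greedy selection, assuming feature vectors have already been computed for every sample of a class:

import numpy as np

def herding_selection(features: np.ndarray, m: int) -> list:
    # Greedily pick m samples so that the mean of the picked features stays
    # as close as possible to the full class mean (iCaRL-style herding).
    class_mean = features.mean(axis=0)
    selected = []
    running_sum = np.zeros_like(class_mean)
    for k in range(1, m + 1):
        scores = np.linalg.norm(class_mean - (running_sum + features) / k, axis=1)
        if selected:
            scores[selected] = np.inf  # never pick the same sample twice
        idx = int(np.argmin(scores))
        selected.append(idx)
        running_sum += features[idx]
    return selected

# Example: keep 5 exemplars out of 100 random 16-dimensional embeddings.
print(herding_selection(np.random.rand(100, 16), m=5))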
GP-Tree: A Gaussian Process Classifier for Few-Shot Incremental Learning
[article]
2021
arXiv
pre-print
We demonstrate the effectiveness of our method against other Gaussian process training baselines, and we show how our general GP approach achieves improved accuracy on standard incremental few-shot learning ...
Here, we propose GP-Tree, a novel method for multi-class classification with Gaussian processes and DKL. ...
Mnemonics training: Multi-class incremental learning without forgetting. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 12245-12254, 2020b. ...
arXiv:2102.07868v4
fatcat:2bkw62rrmzdwbiwbrqo74mnjg4
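As background for the "Gaussian process training baselines" mentioned above, the snippet below runs generic multi-class GP classification with scikit-learn; it is not GP-Tree itself, which instead arranges binary GP classifiers in a tree and combines them with deep kernel learning.

from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# One-vs-rest GP classification on a toy dataset, purely to illustrate the kind
# of baseline a multi-class GP method is compared against.
X, y = load_iris(return_X_y=True)
clf = GaussianProcessClassifier(kernel=1.0 * RBF(1.0), multi_class="one_vs_rest").fit(X, y)
print(clf.score(X, y))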
Improving Vision Transformers for Incremental Learning
[article]
2022
arXiv
pre-print
This paper proposes a working recipe for using Vision Transformers (ViT) in class-incremental learning. ...
Finally, our solution, named ViTIL (ViT for Incremental Learning) achieves new state-of-the-art on both CIFAR and ImageNet datasets for all three class incremental learning setups by a clear margin. ...
In Learning without Forgetting (LwF) [24], knowledge distillation is applied to address catastrophic forgetting under the multi-task incremental learning scenario. ...
arXiv:2112.06103v3
fatcat:xdpvownvvngobitmqy44ca6jbm
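The snippet above recalls that LwF counters forgetting with knowledge distillation. A minimal sketch of such a distillation term follows; the temperature of 2.0 is a common default, not a value taken from this paper.

import torch
import torch.nn.functional as F

def lwf_distillation_loss(new_logits, old_logits, temperature=2.0):
    # Pull the new model's (softened) predictions on the old classes towards
    # those of the frozen previous-phase model.
    old_probs = F.softmax(old_logits / temperature, dim=1)
    new_log_probs = F.log_softmax(new_logits / temperature, dim=1)
    return F.kl_div(new_log_probs, old_probs, reduction="batchmean") * temperature ** 2

# Example: distill over the 10 old-class logits of a batch of 4 samples.
new_logits = torch.randn(4, 10, requires_grad=True)
old_logits = torch.randn(4, 10)  # produced by the frozen previous-phase model
lwf_distillation_loss(new_logits, old_logits).backward()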
Continual Learning Based on OOD Detection and Task Masking
[article]
2022
arXiv
pre-print
Existing continual learning techniques focus on either task incremental learning (TIL) or class incremental learning (CIL) problem, but not both. ...
The key novelty is that each task is trained as an OOD detection model rather than a traditional supervised learning model, and a task mask is trained to protect each task to prevent forgetting. ...
-learning • iCaRL: https://github.com/yaoyao-liu/class-incremental-learning • Mnemonics: https://github.com/yaoyao-liu/class-incremental-learning
Some parameters from different tasks can be shared ...
arXiv:2203.09450v1
fatcat:sjrfhcnx6jgh7kq33ufh47fhle
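The abstract above protects each task with a trained task mask. The sketch below shows one generic way such masking can work, gating a layer's units per task and blocking gradient updates to units claimed by earlier tasks; the sigmoid-sharpening trick and all sizes are assumptions, not this paper's exact mechanism.

import torch

hidden_dim, num_tasks = 64, 3
layer = torch.nn.Linear(32, hidden_dim)
task_masks = torch.nn.Parameter(torch.zeros(num_tasks, hidden_dim))  # per-task mask logits

def masked_forward(x, task_id):
    # Gate the layer's output units with this task's (near-)binary mask.
    gate = torch.sigmoid(task_masks[task_id] * 100.0)
    return layer(x) * gate

def protect_previous_tasks(task_id):
    # Zero the gradients of units already claimed by earlier tasks, so training
    # the current task cannot overwrite them.
    if task_id == 0:
        return
    used = (torch.sigmoid(task_masks[:task_id] * 100.0) > 0.5).any(dim=0).float()
    if layer.weight.grad is not None:
        layer.weight.grad.mul_(1.0 - used.unsqueeze(1))  # rows correspond to output units
    if layer.bias.grad is not None:
        layer.bias.grad.mul_(1.0 - used)

In a real training loop, protect_previous_tasks(task_id) would be called after loss.backward() and before optimizer.step().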
Hypothesis-driven Online Video Stream Learning with Augmented Memory
[article]
2021
arXiv
pre-print
Given a lack of online incremental class learning datasets on video streams, we introduce and adapt two additional video datasets, Toybox and iLab, for online stream learning. ...
The ability to continuously acquire new knowledge without forgetting previous tasks remains a challenging problem for computer vision systems. ...
The lower bound is trained sequentially over all tasks without any measures to avoid catastrophic forgetting. ...
arXiv:2104.02206v4
fatcat:2yi3jeo32ndnno2cs6m7tr76ka
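For reference, the "lower bound" mentioned above is plain sequential fine-tuning with no measure against forgetting; a toy sketch with placeholder task streams:

import torch
import torch.nn.functional as F

# Train on each task in turn; earlier tasks are never revisited, so their
# performance typically collapses (catastrophic forgetting).
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

tasks = [(torch.rand(16, 3, 32, 32), torch.randint(0, 10, (16,))) for _ in range(3)]  # toy streams
for images, labels in tasks:
    optimizer.zero_grad()
    F.cross_entropy(model(images), labels).backward()
    optimizer.step()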
PLOP: Learning without Forgetting for Continual Semantic Segmentation
[article]
2021
arXiv
pre-print
However, continual learning methods are usually prone to catastrophic forgetting. ...
Furthermore, we design an entropy-based pseudo-labelling of the background w.r.t. classes predicted by the old model to deal with background shift and avoid catastrophic forgetting of the old classes. ...
Mnemonics training: Multi-class incremental learning without forgetting. ... [61] Anthony Robins. Catastrophic forgetting, rehearsal and ...
arXiv:2011.11390v3
fatcat:y4jvvsq26ncwtb7vloxxw5rupq
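The abstract above pseudo-labels background pixels from the old model's predictions, filtered by uncertainty. A rough sketch of that idea on a toy segmentation map; the entropy threshold, shapes, and background index are placeholders rather than the paper's actual criterion.

import torch

num_old_classes, H, W = 5, 4, 4
background = 0

old_logits = torch.randn(1, num_old_classes, H, W)   # frozen old model's output
new_labels = torch.randint(0, 3, (1, H, W))           # current ground truth (0 = background)

probs = old_logits.softmax(dim=1)
entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)  # per-pixel uncertainty
old_pred = probs.argmax(dim=1)

confident = entropy < 0.5                 # illustrative threshold
is_background = new_labels == background
pseudo_labels = torch.where(is_background & confident, old_pred, new_labels)
# pseudo_labels can then supervise the new model on otherwise unlabeled old-class pixels.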
A Comprehensive Study of Class Incremental Learning Algorithms for Visual Tasks
[article]
2020
arXiv
pre-print
and analyze them according to these properties, (2) introduce a unified formalization of the class-incremental learning problem, (3) propose a common evaluation framework which is more thorough than existing ...
evidence that it is possible to obtain competitive performance without the use of knowledge distillation to tackle catastrophic forgetting and (6) facilitate reproducibility by integrating all tested ...
Learning without Forgetting (LwF) [36] is a pioneering work that does not require a memory of past classes. ...
arXiv:2011.01844v4
fatcat:kw43joxgwzgsfmmi3su2pjfwv4
Class-incremental learning: survey and performance evaluation on image classification
[article]
2021
arXiv
pre-print
Recently, we have seen a shift towards class-incremental learning where the learner must discriminate at inference time between all classes seen in previous tasks without recourse to a task-ID. ...
The main challenge for incremental learning is catastrophic forgetting, which refers to the precipitous drop in performance on previously learned tasks after learning a new one. ...
1 We do not refer to the scenario where each task only contains a single class, but consider adding a group of classes for each task. ...
arXiv:2010.15277v2
fatcat:wacloedzxrea3dgcuwm7xknyxe
Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting
[article]
2021
arXiv
pre-print
Both generalized and incremental few-shot learning have to deal with three major challenges: learning novel classes from only few samples per class, preventing catastrophic forgetting of base classes, ...
While the first phase learns base classes with many samples, the second phase learns a calibrated classifier for novel classes from few samples while also preventing catastrophic forgetting. ...
Mnemonics training: Multi-class incremental learning without forgetting. ...ized multi-task learning. In Proceedings of the tenth ACM ...
arXiv:2108.08165v1
fatcat:jcwmbksz3vd7feu2unhrtib45q
IB-DRR: Incremental Learning with Information-Back Discrete Representation Replay
[article]
2021
arXiv
pre-print
Incremental learning aims to enable machine learning models to continuously acquire new knowledge given new classes, while maintaining the knowledge already learned for old classes. ...
However, finding a trade-off between the model performance and the number of samples to save for each class is still an open problem for replay-based incremental learning and is increasingly desirable ...
Our experiments on CIFAR-100 [20] showed that Discrete Representation Replay (DRR) outperformed the state-of-the-art replay-based method [26] in the multi-class incremental learning setting by a margin ...
arXiv:2104.10588v1
fatcat:ykcoj3yhofhr7n4kfp6rhimgky
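The DRR entry above replays stored representations rather than raw images. The sketch below illustrates that general pattern with simple 8-bit quantization as a stand-in compression scheme; IB-DRR's actual discrete encoding and information-back mechanism are not reproduced here.

import torch

encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 64))
classifier = torch.nn.Linear(64, 100)

# Compress old-class features into a compact buffer instead of keeping raw images.
with torch.no_grad():
    old_features = encoder(torch.rand(50, 3, 32, 32))
    scale = float(old_features.abs().max() / 127.0)
    buffer = torch.quantize_per_tensor(old_features, scale, 0, torch.qint8)

# In a later phase, dequantize and replay the features through the classifier head.
replayed = buffer.dequantize()
logits = classifier(replayed)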
Showing results 1 — 15 out of 902 results