A copy of this work was available on the public web and has been preserved in the Wayback Machine (capture from 2020). File type: application/pdf.
M2KD: Incremental Learning via Multi-model and Multi-level Knowledge Distillation
2020
British Machine Vision Conference
Incremental learning aims to achieve good performance on new categories without forgetting old ones. Knowledge distillation has been shown to be critical in preserving performance on old classes. Conventional methods, however, sequentially distill knowledge only from the penultimate model, leading to performance degradation on the old classes in later incremental learning steps. In this paper, we propose a multi-model and multi-level knowledge distillation strategy. Instead of sequentially …
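The truncated abstract states the core idea: distill from all previous model snapshots rather than only from the penultimate model. As a rough illustration only, here is a minimal PyTorch-style sketch of what such a multi-model distillation loss could look like; the function name, temperature, and simple averaging over snapshots are assumptions, not the paper's exact formulation, and the multi-level (intermediate-feature) term is omitted.

```python
import torch
import torch.nn.functional as F

def multi_model_distillation_loss(logits_new, snapshot_logits, T=2.0):
    """Hypothetical sketch: average a KD loss against logits from ALL
    previous model snapshots, not just the penultimate one.

    logits_new:      [B, C_new] outputs of the current model
    snapshot_logits: list of [B, C_i] outputs of frozen past snapshots,
                     where C_i <= C_new (each snapshot saw fewer classes)
    """
    loss = 0.0
    for old_logits in snapshot_logits:
        # Restrict the current model to the classes this snapshot knows,
        # soften both distributions with temperature T, then take KL divergence.
        p_old = F.softmax(old_logits / T, dim=1)
        log_p_new = F.log_softmax(logits_new[:, : old_logits.size(1)] / T, dim=1)
        loss = loss + F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)
    return loss / len(snapshot_logits)

# In practice this would be combined with a cross-entropy term on the new
# classes, e.g. total = ce_loss + lambda_kd * multi_model_distillation_loss(...).
```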
dblp:conf/bmvc/ZhouMZXWD20
fatcat:sww5q3b4svde7amlc7ebfqsdqy