A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2018; you can also visit the original URL.
The file type is application/pdf.
Incremental Learning of Object Detectors without Catastrophic Forgetting
2017
2017 IEEE International Conference on Computer Vision (ICCV)
Despite their success for object detection, convolutional neural networks are ill-equipped for incremental learning, i.e., adapting the original model trained on a set of classes to additionally detect objects of new classes, in the absence of the initial training data. They suffer from "catastrophic forgetting": an abrupt degradation of performance on the original set of classes when the training objective is adapted to the new classes. We present a method to address this issue, and learn object detectors incrementally, when neither the original training data nor annotations for the original classes are available.
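To make the idea in the abstract concrete, below is a minimal sketch (not the authors' code) of a distillation-style loss for incremental object detection: a frozen copy of the original detector provides responses for the old classes, and the updated detector is penalized for drifting from them while being trained normally on the new classes. The function name, the use of MSE for the distillation terms, and the loss weighting are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of a combined detection + distillation loss, assuming two-stage
# detector outputs (per-proposal class scores and box regressions).
import torch
import torch.nn.functional as F

def incremental_detection_loss(new_logits, new_bbox, old_logits, old_bbox,
                               targets, num_old_classes, distill_weight=1.0):
    """Standard loss on new classes plus a distillation term on old classes.

    new_logits: (N, C_old + C_new) class scores from the updated detector.
    new_bbox:   (N, 4 * (C_old + C_new)) box regressions from the updated detector.
    old_logits: (N, C_old) class scores from the frozen original detector.
    old_bbox:   (N, 4 * C_old) box regressions from the frozen original detector.
    targets:    (N,) ground-truth labels for proposals of the new classes.
    """
    # Standard cross-entropy on proposals labelled with new classes.
    cls_loss = F.cross_entropy(new_logits, targets)

    # Distillation: keep old-class responses close to the frozen network's.
    distill_cls = F.mse_loss(new_logits[:, :num_old_classes], old_logits)
    distill_box = F.mse_loss(new_bbox[:, :4 * num_old_classes], old_bbox)

    return cls_loss + distill_weight * (distill_cls + distill_box)

# Toy usage with random tensors, just to show the expected shapes.
if __name__ == "__main__":
    N, C_old, C_new = 8, 10, 5
    new_logits = torch.randn(N, C_old + C_new, requires_grad=True)
    new_bbox = torch.randn(N, 4 * (C_old + C_new), requires_grad=True)
    old_logits = torch.randn(N, C_old)      # frozen network, no gradients
    old_bbox = torch.randn(N, 4 * C_old)
    targets = torch.randint(C_old, C_old + C_new, (N,))
    loss = incremental_detection_loss(new_logits, new_bbox, old_logits,
                                      old_bbox, targets, C_old)
    loss.backward()
    print(float(loss))
```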
doi:10.1109/iccv.2017.368
dblp:conf/iccv/ShmelkovSA17
fatcat:j24eunukrzannbd6aummmfb47y