Self-Supervised GAN to Counter Forgetting [article]

Ting Chen and Xiaohua Zhai and Neil Houlsby
2018, arXiv pre-print
GANs involve training two networks in an adversarial game, where each network's task depends on its adversary. Recently, several works have framed GAN training as an online or continual learning problem. We focus on the discriminator, which must perform classification under an (adversarially) shifting data distribution. When trained on sequential tasks, neural networks exhibit forgetting. For GANs, discriminator forgetting leads to training instability. To counter forgetting, we encourage the discriminator to maintain useful representations by adding a self-supervised task. Conditional GANs achieve a similar effect using labels. However, our self-supervised GAN does not require labels, and closes the performance gap between conditional and unconditional models. We show that, in doing so, the self-supervised discriminator learns better representations than regular GANs.
arXiv:1810.11598v2
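The abstract does not spell out which self-supervised task is attached to the discriminator; a common choice in this line of work is rotation prediction. The sketch below, in plain NumPy, illustrates that setup: each image is rotated by 0/90/180/270 degrees, the discriminator gets an auxiliary head that predicts the rotation, and the resulting cross-entropy is added to the adversarial loss. The function names and the weight `alpha` are illustrative, not taken from the paper.

```python
import numpy as np

def make_rotation_batch(images):
    """Build a self-supervised batch: each image rotated by 0/90/180/270
    degrees, with the rotation index as the pseudo-label.
    images: (N, H, W) array -> returns ((4N, H, W) images, (4N,) labels)."""
    rotated = np.concatenate(
        [np.rot90(images, k=k, axes=(1, 2)) for k in range(4)], axis=0
    )
    labels = np.repeat(np.arange(4), len(images))
    return rotated, labels

def cross_entropy(logits, labels):
    """Numerically stable softmax cross-entropy, averaged over the batch."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def discriminator_loss(adv_loss, rot_logits, rot_labels, alpha=1.0):
    """Total discriminator objective: adversarial term plus the weighted
    auxiliary rotation-prediction (self-supervision) term."""
    return adv_loss + alpha * cross_entropy(rot_logits, rot_labels)
```

In training, `rot_logits` would come from an extra 4-way classification head on the discriminator; because the rotation labels are generated from the data itself, no human annotation is needed, which is the sense in which the method closes the gap with conditional GANs without requiring labels.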