Coherent Online Video Style Transfer [article]

Dongdong Chen and Jing Liao and Lu Yuan and Nenghai Yu and Gang Hua
2017 arXiv pre-print
Training a feed-forward network for fast neural style transfer of images has proven successful. However, the naive extension of processing a video frame by frame is prone to producing flickering results. We propose the first end-to-end network for online video style transfer, which generates temporally coherent stylized video sequences in near real-time. Two key ideas are an efficient network that incorporates short-term coherence, and the propagation of short-term coherence to long-term, which ensures consistency over a larger period of time. Our network can incorporate different image stylization networks. We show that the proposed method clearly outperforms the per-frame baseline both qualitatively and quantitatively. Moreover, it achieves coherence visually comparable to optimization-based video style transfer, while being three orders of magnitude faster at runtime.
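To make the notion of short-term coherence concrete, the sketch below illustrates the standard flow-based mechanism such methods build on: the previous stylized frame is warped to the current frame with optical flow and reused wherever the flow is reliable, with the fresh per-frame stylization used elsewhere. This is a minimal illustrative sketch of the general idea, not the authors' released implementation; the function names (`warp_with_flow`, `blend_coherent`) and the assumption of a precomputed backward flow and occlusion mask are hypothetical.

```python
# Illustrative sketch of flow-based short-term coherence (not the paper's exact network).
import torch
import torch.nn.functional as F


def warp_with_flow(prev_frame: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Warp prev_frame (N,C,H,W) toward the current frame using backward flow (N,2,H,W)."""
    n, _, h, w = prev_frame.shape
    # Base sampling grid in pixel coordinates.
    ys, xs = torch.meshgrid(
        torch.arange(h, device=prev_frame.device, dtype=prev_frame.dtype),
        torch.arange(w, device=prev_frame.device, dtype=prev_frame.dtype),
        indexing="ij",
    )
    grid_x = xs.unsqueeze(0) + flow[:, 0]  # displaced x coordinates
    grid_y = ys.unsqueeze(0) + flow[:, 1]  # displaced y coordinates
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    grid_x = 2.0 * grid_x / max(w - 1, 1) - 1.0
    grid_y = 2.0 * grid_y / max(h - 1, 1) - 1.0
    grid = torch.stack((grid_x, grid_y), dim=-1)  # (N,H,W,2)
    return F.grid_sample(prev_frame, grid, align_corners=True)


def blend_coherent(stylized_now: torch.Tensor,
                   stylized_prev: torch.Tensor,
                   flow: torch.Tensor,
                   occlusion_mask: torch.Tensor) -> torch.Tensor:
    """Reuse the warped previous stylization where the flow is trusted (mask == 1)
    and fall back to the fresh per-frame stylization at occlusions (mask == 0)."""
    warped_prev = warp_with_flow(stylized_prev, flow)
    return occlusion_mask * warped_prev + (1.0 - occlusion_mask) * stylized_now


if __name__ == "__main__":
    n, c, h, w = 1, 3, 64, 64
    cur = torch.rand(n, c, h, w)      # stylized current frame
    prev = torch.rand(n, c, h, w)     # stylized previous frame
    flow = torch.zeros(n, 2, h, w)    # backward optical flow (zero = static scene)
    mask = torch.ones(n, 1, h, w)     # 1 where flow is reliable, 0 at occlusions
    out = blend_coherent(cur, prev, flow, mask)
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```

Propagating this blend frame after frame is what extends short-term coherence to longer time spans, since each output already carries the accumulated history of the reused stylized regions.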
arXiv:1703.09211v2