Learning Neurosymbolic Generative Models via Program Synthesis [article]

Halley Young and Osbert Bastani and Mayur Naik
2019, arXiv pre-print
Significant strides have been made toward designing better generative models in recent years. Despite this progress, however, state-of-the-art approaches are still largely unable to capture complex global structure in data. For example, images of buildings typically contain spatial patterns such as windows repeating at regular intervals; state-of-the-art generative methods cannot easily reproduce these structures. We propose to address this problem by incorporating programs representing global structure into the generative model---e.g., a 2D for-loop may represent a configuration of windows. Furthermore, we propose a framework for learning these models by leveraging program synthesis to generate training data. On both synthetic and real-world data, we demonstrate that our approach is substantially better than the state-of-the-art at both generating and completing images that contain global structure.
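To make the "2D for-loop" idea concrete, below is a minimal sketch of the kind of program the abstract alludes to: a nested loop that renders a regular grid of windows into an image array. The function name `render_window_grid` and its parameters are illustrative assumptions, not the paper's actual DSL or implementation.

```python
import numpy as np

def render_window_grid(height=64, width=64, rows=4, cols=4,
                       win_h=8, win_w=8, stride_y=16, stride_x=16,
                       offset_y=4, offset_x=4):
    """Render a binary image with windows repeating at regular intervals.

    A 2D for-loop like this captures the global structure (a regular grid
    of windows) that purely neural generative models struggle to reproduce.
    Hypothetical example; the paper's program representation may differ.
    """
    img = np.zeros((height, width), dtype=np.uint8)
    for i in range(rows):            # loop over window rows
        for j in range(cols):        # loop over window columns
            y = offset_y + i * stride_y
            x = offset_x + j * stride_x
            img[y:y + win_h, x:x + win_w] = 1   # draw one window
    return img

if __name__ == "__main__":
    structure = render_window_grid()
    print(structure.shape, int(structure.sum()))  # (64, 64), 16 windows of 64 pixels each
```

In the paper's framing, a program of this form would supply the repeating global pattern, while a neural generative model fills in appearance details; program synthesis is used to recover such programs from data for training.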
arXiv:1901.08565v1