Shape Synthesis from Sketches via Procedural Models and Convolutional Networks

Haibin Huang, Evangelos Kalogerakis, Ersin Yumer, Radomir Mech
2017 IEEE Transactions on Visualization and Computer Graphics  
Procedural modeling techniques can produce high-quality visual content through complex rule sets. However, controlling the outputs of these techniques for design purposes is often notoriously difficult for users, due to the large number of parameters involved in these rule sets and their non-linear relationship to the resulting content. To circumvent this problem, we present a sketch-based approach to procedural modeling. Given an approximate and abstract hand-drawn 2D sketch provided by a user, our algorithm automatically computes a set of procedural model parameters, which in turn yield multiple, detailed output shapes that resemble the user's input sketch. The user can then select an output shape, or further modify the sketch to explore alternative ones. At the heart of our approach is a deep Convolutional Neural Network (CNN) that is trained to map sketches to procedural model parameters. The network is trained on large amounts of automatically generated synthetic line drawings. By using an intuitive medium, i.e., freehand sketching, as input, users are freed from manually adjusting procedural model parameters, yet they are still able to create high-quality content. We demonstrate the accuracy and efficacy of our method in a variety of procedural modeling scenarios, including the design of man-made and organic shapes.

Index Terms: shape synthesis, convolutional neural networks, procedural modeling, sketch-based modeling
doi:10.1109/tvcg.2016.2597830 pmid:27514042
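The training-data pipeline the abstract describes (sample procedural parameters, render a synthetic line drawing, pair the two for supervised learning) can be illustrated with a toy stand-in. The ellipse "procedural model" below and all function names are illustrative assumptions for exposition, not the paper's actual rule sets or renderer:

```python
import numpy as np

def sample_parameters(rng):
    # Toy "procedural model": an ellipse with two radii and a rotation.
    # A real rule set would have many more, hierarchically coupled parameters.
    return np.array([rng.uniform(5.0, 14.0),    # radius_x
                     rng.uniform(5.0, 14.0),    # radius_y
                     rng.uniform(0.0, np.pi)])  # rotation angle

def render_line_drawing(params, size=32):
    # Rasterize the ellipse outline into a binary image, standing in for
    # the automatically generated synthetic line drawings used for training.
    rx, ry, theta = params
    t = np.linspace(0.0, 2.0 * np.pi, 400)
    x, y = rx * np.cos(t), ry * np.sin(t)
    xr = x * np.cos(theta) - y * np.sin(theta) + size / 2
    yr = x * np.sin(theta) + y * np.cos(theta) + size / 2
    img = np.zeros((size, size), dtype=np.float32)
    xi = np.clip(xr.astype(int), 0, size - 1)
    yi = np.clip(yr.astype(int), 0, size - 1)
    img[yi, xi] = 1.0
    return img

def make_dataset(n, seed=0):
    # Each example pairs a synthetic sketch (CNN input) with the
    # parameter vector that produced it (regression target).
    rng = np.random.default_rng(seed)
    pairs = [(lambda p: (render_line_drawing(p), p))(sample_parameters(rng))
             for _ in range(n)]
    X, Y = zip(*pairs)
    return np.stack(X), np.stack(Y)

X, Y = make_dataset(100)
print(X.shape, Y.shape)  # (100, 32, 32) (100, 3)
```

A CNN trained on such (sketch, parameters) pairs learns the inverse mapping from drawings to parameters, which is the role the network plays in the method above.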