Interactive Sketching of Urban Procedural Models

Gen Nishida(1), Ignacio Garcia-Dorado(1), Daniel G. Aliaga(1), Bedrich Benes(1), Adrien Bousseau(2)

(1) Purdue University
(2) Inria

ACM Transactions on Graphics, 2016 (Proceedings of ACM SIGGRAPH 2016)

Abstract

3D modeling remains a notoriously difficult task for novices despite significant research effort to provide intuitive and automated systems. We tackle this problem by combining the strengths of two popular domains: sketch-based modeling and procedural modeling. On the one hand, sketch-based modeling exploits our ability to draw but requires detailed, unambiguous drawings to achieve complex models. On the other hand, procedural modeling automates the creation of precise and detailed geometry but requires the tedious definition and parameterization of procedural models. Our system uses a collection of simple procedural grammars, called snippets, as building blocks to turn sketches into realistic 3D models. We use a machine learning approach to solve the inverse problem of finding the procedural model that best explains a user sketch. We use nonphotorealistic rendering to generate artificial data for training convolutional neural networks capable of quickly recognizing the procedural rule intended by a sketch and estimating its parameters. We integrate our algorithm in a coarse-to-fine urban modeling system that allows users to create rich buildings by successively sketching the building mass, roof, facades, windows, and ornaments. A user study shows that by using our approach non-expert users can generate complex buildings in just a few minutes.
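To make the two recognition tasks in the abstract concrete, the following is a minimal, hypothetical sketch (not the paper's actual architecture) of a network forward pass with the two outputs described: a softmax over candidate procedural snippets and a regression of the chosen snippet's continuous parameters. The layer sizes, weight names, and the use of plain NumPy with random, untrained weights are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive valid-mode 2D cross-correlation, for illustration only."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def forward(sketch, weights):
    """One conv layer + ReLU + global average pooling + two linear heads:
    one classifying the intended snippet, one regressing its parameters."""
    feat = np.maximum(conv2d_valid(sketch, weights["kernel"]), 0.0)  # ReLU
    pooled = np.array([feat.mean()])                                 # global avg pool
    logits = weights["W_cls"] @ pooled + weights["b_cls"]            # snippet scores
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                                             # softmax
    params = weights["W_reg"] @ pooled + weights["b_reg"]            # parameter estimates
    return probs, params

# Untrained random weights stand in for a trained network;
# 4 candidate snippets and 3 continuous parameters are arbitrary choices.
weights = {
    "kernel": rng.standard_normal((3, 3)),
    "W_cls": rng.standard_normal((4, 1)), "b_cls": np.zeros(4),
    "W_reg": rng.standard_normal((3, 1)), "b_reg": np.zeros(3),
}
sketch = (rng.random((32, 32)) > 0.9).astype(float)  # toy binary "sketch" raster
probs, params = forward(sketch, weights)
```

In the actual system such a network would be trained on the artificially generated sketch data mentioned above, and the predicted snippet plus its parameters would instantiate the corresponding grammar rule.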

BibTeX

@article{Nishida:2016:ISU:2897824.2925951,
 author = {Nishida, Gen and Garcia-Dorado, Ignacio and Aliaga, Daniel G. and Benes, Bedrich and Bousseau, Adrien},
 title = {Interactive Sketching of Urban Procedural Models},
 journal = {ACM Trans. Graph.},
 volume = {35},
 number = {4},
 year = {2016},
}

Acknowledgement

We would like to thank Jennifer Neville and Hogun Park for their comments and suggestions about the CNN architectures, our user study participants for their time and feedback, and the anonymous reviewers for their constructive suggestions. This research was partially funded by NSF CBET 1250232, NSF IIS 1302172, a Google Research Award, and the French ANR project SEMAPOLIS (ANR-13-CORD-0003).