Interactive Sketching of Urban Procedural Models

3D modeling remains a notoriously difficult task for novices despite significant research effort to provide intuitive and automated systems. We tackle this problem by combining the strengths of two popular domains: sketch-based modeling and procedural modeling. On the one hand, sketch-based modeling exploits our ability to draw but requires detailed, unambiguous drawings to achieve complex models. On the other hand, procedural modeling automates the creation of precise and detailed geometry but requires the tedious definition and parameterization of procedural models. Our system uses a collection of simple procedural grammars, called snippets, as building blocks to turn sketches into realistic 3D models. We use a machine learning approach to solve the inverse problem of finding the procedural model that best explains a user sketch. We use non-photorealistic rendering to generate artificial data for training convolutional neural networks capable of quickly recognizing the procedural rule intended by a sketch and estimating its parameters. We integrate our algorithm in a coarse-to-fine urban modeling system that allows users to create rich buildings by successively sketching the building mass, roof, facades, windows, and ornaments. A user study shows that by using our approach non-expert users can generate complex buildings in just a few minutes.
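To make the snippet idea concrete, the following is a minimal illustrative sketch, not the paper's actual grammar or learning pipeline: a hypothetical cuboid mass "snippet" with a few parameters, and a toy brute-force parameter fit standing in for the CNN-based recognition and regression described above. All names and parameters here are invented for illustration.

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class MassSnippet:
    """Hypothetical procedural snippet: an axis-aligned cuboid
    building mass parameterized by width, depth, and height."""
    width: float
    depth: float
    height: float

    def vertices(self):
        # The 8 corners of the cuboid, with the base at y = 0.
        w, d, h = self.width, self.depth, self.height
        return [(x, y, z) for x in (0, w) for y in (0, h) for z in (0, d)]

def footprint_area(snippet):
    # Ground-plane area covered by the mass.
    return snippet.width * snippet.depth

def fit_mass(target_w, target_h, grid=range(1, 11)):
    """Toy stand-in for the paper's learned parameter estimation:
    grid-search the snippet parameters that best match a target
    silhouette, here reduced to a width/height pair read off a sketch."""
    best, best_err = None, float("inf")
    for w, h in product(grid, grid):
        err = abs(w - target_w) + abs(h - target_h)
        if err < best_err:
            best, best_err = MassSnippet(w, w, h), err
    return best
```

For example, `fit_mass(4.2, 7.8)` returns `MassSnippet(width=4, depth=4, height=8)`. The actual system replaces this exhaustive search with convolutional networks trained on non-photorealistically rendered sketches, which makes recognition fast enough for interactive use.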

See also

See how we modeled a variety of real-world buildings using our system in the supplemental material. Download the slides from our SIGGRAPH talk. See also the project webpage.

BibTex references

@Article{NGGBB16,
  author  = "Nishida, Gen and Garcia-Dorado, Ignacio and Aliaga, Daniel G. and Benes, Bedrich and Bousseau, Adrien",
  title   = "Interactive Sketching of Urban Procedural Models",
  journal = "ACM Transactions on Graphics (SIGGRAPH Conference Proceedings)",
  year    = "2016",
  url     = "http://www-sop.inria.fr/reves/Basilic/2016/NGGBB16"
}