Our device and real-time optimization enable spray painting murals of any input photograph. We track the user’s movements and simulate the spraying process, choosing the optimal amount of paint to spray at each instant to best approximate the input image.

abstract

We propose a system for painting large-scale murals of arbitrary input photographs. To that end, we choose spray paint, which is easy to use and affordable, yet requires skill to create interesting murals. An untrained user simply waves a programmatically actuated spray can in front of the canvas. Our system tracks the can’s position and determines the optimal amount of paint to disperse to best approximate the input image. We accurately calibrate our spray paint simulation model in a pre-process and devise optimization routines for run-time paint dispersal decisions. Our setup is lightweight: it includes two webcams and QR-coded cubes for tracking, and a small actuation device for the spray can, attached via a 3D-printed mount. The system performs at haptic rates, which allows the user -- informed by a visualization of the image residual -- to guide the system interactively to recover low-frequency features. We validate our pipeline on a variety of grayscale and color input images and present results both in simulation and as physically realized murals.
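The run-time paint dispersal decision described above can be sketched as a small least-squares problem: given the current canvas, the target image, and a simulated spray footprint under the nozzle, pick the spray amount that most reduces the residual. The Gaussian footprint and per-pixel compositing model below are illustrative assumptions, not the paper's calibrated simulation model:

```python
import numpy as np

def gaussian_footprint(shape, center, sigma):
    """Radially symmetric spray deposition profile (a modeling assumption)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    d2 = (xs - center[1]) ** 2 + (ys - center[0]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def optimal_paint_amount(canvas, target, footprint, color):
    """Least-squares optimal spray amount a in [0, 1] for one time step.

    Assumed canvas update: C' = C + a * f * (color - C), i.e. per-pixel
    alpha compositing weighted by the footprint f. Minimizing the squared
    residual ||C' - T||^2 over a gives a closed-form solution.
    """
    d = footprint * (color - canvas)   # change per unit spray amount
    r = target - canvas                # residual we want to cancel
    denom = np.sum(d * d)
    if denom < 1e-12:                  # spraying has no effect here
        return 0.0
    return float(np.clip(np.sum(d * r) / denom, 0.0, 1.0))
```

For example, with black paint over a white canvas whose target is dark under the nozzle, the optimal amount saturates near 1; where canvas and target already agree, it is 0, so the system sprays nothing as the user passes over well-painted regions.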

downloads

Paper (Computers & Graphics, official version available at http://www.sciencedirect.com/)

Data (additional simulated results and comparisons)

Video

BibTex entry

accompanying video

blog/press articles

acknowledgments

The authors would like to thank Gilles Caprari for his help in developing the prototype version of the device, Maurizio Nitti for the concept art he created, and the Department of Computer Science at ETH Zurich for lending us a painting workspace. We also thank our colleagues from DRZ, IGL, and CGL for insightful discussions and early user testing.