
Computers & Graphics

Volume 55, April 2016, Pages 108-117

Technical Section
Large-scale painting of photographs by interactive optimization

https://doi.org/10.1016/j.cag.2015.11.001

Highlights

  • An interactive and light-weight system that allows users to paint large-scale murals.

  • An analytical model designed and calibrated to accurately simulate spray paint.

  • An inexpensive actuation device and light-weight tracking system.

  • The pipeline was validated with physically realized and simulated examples.

Abstract

We propose a system for painting large-scale murals of arbitrary input photographs. To that end, we choose spray paint, which is easy to use and affordable, yet requires skill to create interesting murals. An untrained user simply waves a programmatically actuated spray can in front of the canvas. Our system tracks the can's position and determines the optimal amount of paint to disperse to best approximate the input image. We accurately calibrate our spray paint simulation model in a pre-process and devise optimization routines for run-time paint dispersal decisions. Our setup is light-weight: it includes two webcams and QR-coded cubes for tracking, and a small actuation device for the spray can, attached via a 3D-printed mount. The system performs at haptic rates, which allows the user – informed by a visualization of the image residual – to guide the system interactively to recover low frequency features. We validate our pipeline for a variety of grayscale and color input images and present results in simulation and physically realized murals.

Introduction

Spray paint is affordable and easy to use. As a result, large-scale spray paint murals are ubiquitous and take a prominent place in modern culture (see [1] for many examples). Spray painters may cover large “canvases”, such as walls of buildings, with minimal scaffolding hassle, and the diffusive spray allows spatially graded color mixing on the fly. However, manual creation of interesting spray paintings is currently restricted to skilled artists. In addition, the large scale of the painting, compared to the close-range spraying (from distances of 10–40 cm), makes it challenging to orient and precisely position oneself for accurate spraying, forcing the artist to maintain a global vision while focusing on local changes.

Though traditional (e.g. inkjet) printing on large-format paper is possible, it requires access to expensive non-standard equipment. Further, depending on the target surface, it may be impossible to attach paper or canvas. Paint provides a practical alternative, and decidedly induces a certain aesthetic character. Naïve solutions to assisted spray painting, such as procedural dithering or half-toning, are tedious and do not take advantage of the painter's perceptual expertise: a human can easily tell which important areas of an image need further detail. Non-interactive systems inherently lack this ability, potentially wasting precious painting time on unimportant regions. Stenciling is another obvious candidate, but it necessitates quantization to solid colors and may require many topologically complex stencils. Large-scale murals would also require cumbersome large-scale stencils. Finally, stencils do not necessarily inherit any aesthetics particular to spray painting: the same paintings could be made with brushes or rollers.

Our solution is a “smart” spray can. From a high level, rather than spraying a solid color, our can sprays a photograph (see Fig. 2). Our system tracks the position and orientation of the spray can held by the user, who may be regarded as a cheap alternative to a robotic arm. Our optimization then determines on-the-fly how much paint to spray, or, more precisely, how long to spray, and issues appropriate commands to an actuating device attached to the spray can (see Fig. 1). By simultaneously simulating the spraying process, we visualize a residual image that indicates to the user the locations on the mural that could benefit from more painting (see Fig. 3). We also monitor the potential residual as well as the potential benefit for the current spray can color. These properties are respectively the maximum amount of error that can possibly be reduced by adding more paint of the current color, and the expected effect of adding more of the current color. When little progress can be made with the current color, the user is prompted to switch color, and the process is repeated until satisfaction.
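The run-time dispersal decision hinges on the residual and on the expected effect of adding more paint of the current color. As an illustration only (not the authors' code), the sketch below shows one way such per-pixel quantities could be estimated; the Gaussian footprint model, the function names, and the normalization are assumptions made for this example.

```python
import numpy as np

def spray_footprint(radius_px):
    """Hypothetical radially symmetric coverage profile (Gaussian falloff),
    normalized so the peak opacity of one spray 'unit' is 1."""
    y, x = np.mgrid[-radius_px:radius_px + 1, -radius_px:radius_px + 1]
    f = np.exp(-(x ** 2 + y ** 2) / (2.0 * (radius_px / 2.0) ** 2))
    return f / f.max()

def potential_benefit(canvas, target, color, center, radius_px):
    """Estimate how much the local residual would shrink if one unit of the
    current color were sprayed at `center`. Images are float arrays in [0, 1]
    of shape (H, W, 3); `color` is a length-3 array."""
    f = spray_footprint(radius_px)[..., None]
    r, (cy, cx) = radius_px, center
    patch = canvas[cy - r:cy + r + 1, cx - r:cx + r + 1]
    tgt = target[cy - r:cy + r + 1, cx - r:cx + r + 1]
    before = np.abs(tgt - patch).sum()            # current residual in the patch
    blended = (1.0 - f) * patch + f * color       # alpha-blend one unit of spray
    after = np.abs(tgt - blended).sum()           # residual after spraying
    return before - after                         # positive -> spraying here helps
```

In this toy formulation, when the benefit stays near zero everywhere on the canvas, little progress can be made with the current color, which corresponds to the moment the system prompts the user to switch colors.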

We demonstrate the effectiveness of this process for a variety of input images. We present physically realized paintings, as well as simulated results. The physical murals validate that our simulation matches reality and show that our model captures the image content while preserving some of the spray paint aesthetic.

Section snippets

Historical perspective and related work

Computer-aided painting is an old and well-studied subject among both scientists and artists. Artist Desmond Paul Henry unveiled his Henry Drawing Machine in 1962. This machine created physical realizations of procedurally generated drawings. One year later, Ivan Sutherland's famous SKETCHPAD pioneered interactive virtual interfaces for drawing and modeling. Now, the modern frontier of research in computer-aided painting is more specialized and spans a variety of interfaces and applications.


Method

The user will stand before a canvas (e.g. a wall or sheet of paper) and wave a programmatically actuated spray can equipped with a wireless receiver. Running on a nearby computer, our real-time algorithm determines the optimal amount of paint of the current color to spray at the spray can's tracked location. Our run-time system can be broken down into four parts: (1) physically actuating the spray can, (2) spray can tracking, (3) simulating the spray process, and (4) optimizing the amount of paint to spray.
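As a structural sketch only (the interfaces below are hypothetical placeholders, not the paper's API), the four run-time parts could be wired together in a loop along these lines:

```python
import time

def painting_loop(tracker, simulator, actuator, target, color, rate_hz=60):
    """Hypothetical run-time loop: (2) track the can, (4) decide how long to
    spray, (1) actuate the cap, and (3) update the simulated mural."""
    period = 1.0 / rate_hz
    while not tracker.session_finished():
        pose = tracker.current_pose()                 # position + orientation of the can
        duration = simulator.optimal_spray_time(pose, target, color)
        if duration > 0.0:
            actuator.spray(duration)                  # press the cap via the servo for `duration` s
            simulator.deposit(pose, color, duration)  # accumulate paint in the simulated mural
        time.sleep(period)                            # keep the loop near haptic rates
```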

Results

We have validated our system by spray painting a set of photographs, as shown in Figs. 9–12 and the accompanying video. Table 1 presents some statistics of our experiments.

A typical painting session begins by registering the cameras with respect to the wall. Then the user can directly start spraying the chosen input image. An excerpt from a typical session is shown in Fig. 3. A monitor showing the current potential residual helps the user determine which part of the mural could benefit from more paint.
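The initial registration step maps camera pixels to wall coordinates. A minimal sketch of one way to do this with OpenCV is shown below, assuming four fiducials at measured wall positions; the coordinates and marker detection are hypothetical, and the paper's actual two-camera, QR-cube tracking is more involved.

```python
import numpy as np
import cv2

# Hypothetical: four fiducials at known wall positions (metres) detected in the
# webcam image; a homography then maps image pixels to wall coordinates.
wall_pts = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.5], [0.0, 1.5]], dtype=np.float32)
img_pts = np.array([[112, 640], [1180, 655], [1165, 88], [98, 70]], dtype=np.float32)
H, _ = cv2.findHomography(img_pts, wall_pts)

def image_to_wall(u, v):
    """Project an image pixel (u, v) onto the wall plane."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]
```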

Discussion and future work

Analogous to the “sculpting by numbers” approach of Rivers et al. [11], we do not aim to train the user to become a skilled, unassisted spray painter, nor are we expecting to reach the quality of professional artists. Instead, our system provides the basic technology to spray paint an input image. Without it, a novice would only produce a rough abstraction of the image, especially for the scale we target. However, our current system does not offer a very creative user experience and

Conclusion

We presented an interactive system and an online spray painting simulation algorithm, enabling novice users to paint large-scale murals of arbitrary input photographs. Our system aids the user in tasks that are difficult for humans, especially when lacking artistic training and experience: it automatically tracks the position of the spray can relative to the mural and makes decisions regarding the amount of paint to spray, based on an online simulation of the spraying process. We devise a

Acknowledgments

The authors would like to thank Gilles Caprari for his help in developing the prototype version of the device, Maurizio Nitti for the concept art he created, and the Computer Science department of ETH Zurich for lending us a painting workspace. We also thank our colleagues from DRZ, IGL and CGL for insightful discussions and early user testing.

References (40)

  • T. Lindemeier et al. Image stylization with a painting machine using semantic hints. Comput Graph (2013)
  • Ganz N. Graffiti world: street art from five continents. Harry N Abrams, New York, NY (USA);...
  • Deussen O, Lindemeier T, Pirk S, Tautzenberger M. Feedback-guided stroke placement for a painting machine. In:...
  • Yao F, Shao G. Painting brush control techniques in Chinese painting robot. In: Proceedings of IEEE international...
  • Tresset PA, Leymarie FF. Sketches by Paul the robot. In: Proceedings of CAe. ISBN 978-1-4503-1584-5, 2012. p....
  • Lehni U, Hektor. In a beautiful place out in the country. In: Wenn Roboter Zeichnen. Kunstmuseum Solothurn;...
  • Flagg M, Rehg JM. Projector-guided painting. In: Proceedings of UIST. ISBN 1-59593-313-1, 2006. p....
  • Laviole J, Hachet M. Spatial augmented reality to enhance physical artistic creation. In: Adjunct Proceedings of UIST....
  • Iarussi E, Bousseau A, Tsandilas T. The drawing assistant: automated drawing guidance and feedback from photographs....
  • A. Rivers et al. Position-correcting tools for 2D digital fabrication. ACM Trans Graph (2012)
  • A. Rivers et al. Sculpting by numbers. ACM Trans Graph (2012)
  • Zoran A, Shilkrot R, Paradiso J. Human–computer interaction for hybrid carving. In: Proceedings of UIST. ISBN...
  • H. Yoshida et al. Architecture-scale human-assisted additive manufacturing. ACM Trans Graph (2015)
  • Shilkrot R, Maes P, Zoran A. Physical painting with a digital airbrush. In: SIGGRAPH emerging technologies,...
  • R. Shilkrot et al. Augmented airbrush for computer aided painting (CAP). ACM Trans Graph (2015)
  • Haeberli P. Paint by numbers: abstract image representations. In: Proceedings of ACM SIGGRAPH, 1990. p....
  • Hertzmann A. Painterly rendering with curved brush strokes of multiple sizes. In: Proceedings of ACM SIGGRAPH, 1998. p....
  • K. Zeng et al. From image parsing to painterly rendering. ACM Trans Graph (2009)
  • Baxter B, Scheib V, Lin MC, Manocha D. DAB: interactive haptic painting with 3D virtual brushes. In: Proceedings of ACM...
  • Baxter W, Wendt J, Lin MC. IMPaSTo: a realistic, interactive model for paint. In: Proceedings of NPAR, 2004. p....