ABSTRACT
Traditional GUI applications provide limited support for tangible interaction: most applications are not programmed to accept tangible input, and most input devices do not provide haptic feedback. To address this limitation, we introduce GUI Robots, a software framework that enables developers to repurpose off-the-shelf robots as tangible input and haptic output devices and to connect them to unmodified desktop applications. We describe the GUI Robots framework and present several proof-of-concept applications, including a haptic scroll wheel, force-feedback game controllers, a 3D mouse, and a self-driving notification robot. To evaluate whether GUI Robots supports prototyping tangible interfaces for existing applications, we conducted a user study in which developers created customized tangible interfaces for two applications; participants completed working tangible interfaces for both applications in under an hour. GUI Robots thus allows developers to extend existing applications with tangible input and haptic output with little effort.
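The core idea behind the framework — reading state from an off-the-shelf robot and translating it into synthetic input events for an unmodified application — can be sketched as follows. This is a hypothetical illustration, not the paper's actual implementation: the `Robot` class is a stand-in for a real robot SDK, and events are recorded in a list where a real system would call an OS-level input-injection API.

```python
# Hypothetical sketch of the haptic-scroll-wheel idea: poll a robot's wheel
# encoder and quantize its rotation into scroll events for an unmodified app.
# `Robot` stands in for a real robot SDK; `events` stands in for OS injection.

class Robot:
    """Stand-in for an off-the-shelf robot exposing a wheel encoder (degrees)."""
    def __init__(self):
        self.encoder_degrees = 0


class HapticScrollWheel:
    """Maps accumulated wheel rotation to discrete scroll ticks."""

    def __init__(self, robot, degrees_per_tick=15):
        self.robot = robot
        self.degrees_per_tick = degrees_per_tick
        self._last = robot.encoder_degrees
        self.events = []  # stand-in for OS-level event injection

    def poll(self):
        """Emit one scroll event per full tick of rotation since the last poll."""
        delta = self.robot.encoder_degrees - self._last
        ticks = int(delta / self.degrees_per_tick)  # truncates toward zero
        if ticks:
            self._last += ticks * self.degrees_per_tick  # keep partial rotation
            self.events.append(("scroll", ticks))
        return ticks


robot = Robot()
wheel = HapticScrollWheel(robot)
robot.encoder_degrees = 48   # user turns the wheel about three ticks clockwise
print(wheel.poll())          # → 3
```

Haptic output would run in the opposite direction: the framework observes the application (for example, the view reaching the end of a document) and drives the robot's motors to produce resistance or detents.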
Index Terms
- GUI Robots: Using Off-the-Shelf Robots as Tangible Input and Output Devices for Unmodified GUI Applications