ABSTRACT
What will the universal remote control of the near future look like? What form will the next generation of human-computer interfaces take? Will they be conspicuous interfaces within the built environment, like a computer screen or a smart speaker? Will they resemble the ubiquitous, portable rectangles that we all carry in our pockets? We propose a third paradigm: interfaces that hide in plain sight, inconspicuously integrated into the furniture always already around us, but ready to be called upon when needed to establish a user interface. Our furniture-robot prototype, tbo the TableBot, demonstrates the viability of this furniture-based human-computer paradigm.
Index Terms
- Robots as Furniture, Integrating Human-Computer Interfaces into the Built Environment