ABSTRACT
Smart buildings have become an essential part of everyday life. With the growing availability of addressable fixtures and controls, operating building components has become complex for occupants, especially when multimodal controls are involved; occupants therefore need a user-friendly way to manage these controls. We propose the MIBO IDE, an integrated development environment for defining, connecting, and managing multimodal controls, designed with end users in mind. The MIBO IDE offers occupants a convenient way to create and customize multimodal interaction models while taking their preferences, culture, and potential physical limitations into account. Its core component, the MIBO Editor, lets occupants define interaction models visually using the drag-and-drop metaphor, without requiring programming skills. In addition, the MIBO IDE provides components for debugging and compiling interaction models as well as for detecting conflicts among them.
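The kind of interaction model the MIBO Editor composes visually, and the conflict detection the IDE performs on a set of such models, can be sketched in plain code. Everything below is an illustrative assumption, not the actual MIBO implementation: the class names, the event strings, and the equality-based conflict rule (identical trigger events bound to different fixture commands) are hypothetical.

```python
# Illustrative sketch only: class names, event strings, and the
# conflict rule are hypothetical, not the actual MIBO API.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class InteractionModel:
    """Maps a set of co-occurring modality events to a building action."""
    name: str
    triggers: frozenset   # e.g. {"voice:on", "gesture:point"}
    action: str           # command for an addressable fixture

    def matches(self, observed_events) -> bool:
        # The model fires when all of its trigger events are observed.
        return self.triggers <= set(observed_events)


@dataclass
class ModelRegistry:
    """Holds interaction models and flags pairs that collide."""
    models: list = field(default_factory=list)

    def add(self, model: InteractionModel) -> None:
        self.models.append(model)

    def conflicts(self):
        # Two models conflict when identical trigger events would
        # issue different fixture commands.
        found = []
        for i, a in enumerate(self.models):
            for b in self.models[i + 1:]:
                if a.triggers == b.triggers and a.action != b.action:
                    found.append((a.name, b.name))
        return found


registry = ModelRegistry()
registry.add(InteractionModel("lights-on",
                              frozenset({"voice:on", "gesture:point"}),
                              "lamp.on"))
registry.add(InteractionModel("blinds-up",
                              frozenset({"voice:on", "gesture:point"}),
                              "blinds.up"))
print(registry.conflicts())   # -> [('lights-on', 'blinds-up')]
```

In this sketch, both models are triggered by the same voice-plus-gesture combination but command different fixtures, so the registry reports them as a conflicting pair, analogous to what the MIBO IDE's conflict detection would surface to the occupant.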
Index Terms
- An IDE for multimodal controls in smart buildings