Abstract:
This paper demonstrates an autonomous living building system that adapts its behavior to changing environmental conditions as they occur. The system was implemented in a real-time, real-world scenario at the University of Houston, where multiple Microsoft Kinect sensors were deployed to demonstrate the living building capability. The Kinect sensors generate point-cloud data representing the dynamic context; the static context is captured with a CAD tool and archived in the system to model the built environment. Both static and dynamic context information are processed and summarized to infer the interaction between the occupant (dynamic context) and the building (static context). The results are then passed to an Artificial Intelligence (AI) engine, which uses a supervised machine learning algorithm to infer the decision the building should take to adjust the environment. The implemented smart building prototype was demonstrated in a home library setting and provides clear evidence of the capability of such systems. The system reacted within the timeframe acceptable for human activity (seconds). The prototype sets the stage for a more complex scenario study using hundreds of sensors and actuators, and will serve as a testbed for studying distributed control and communication systems for the living building concept.
Date of Conference: 24-28 June 2019
Date Added to IEEE Xplore: 22 July 2019
ISBN Information: