
Multiple RGB-D Camera-based User Intent Position and Object Estimation


Abstract:

Human gaze indicates a person's area of interest. By analyzing a time series of these areas, it is possible to obtain user behavioral patterns that can be used in various fields. Well-known techniques for estimating human gaze are inconvenient because they either require a wearable device or cover a relatively narrow measurement area. In this paper, a method for implementing a gaze estimation system using 3D view tracking with multiple RGB-D cameras is proposed. Surrounding 3D cameras are used to extract the user's region of interest from 3D gaze estimation in a living space, without a wearable device. To implement the proposed method, first, 3D space mapping is performed through multiple RGB-D camera calibration. The resulting 3D map is the measurement area, which depends on the number and specifications of the RGB-D cameras used. Then, when a person enters the 3D map, the face region is detected using both 2D and 3D data, and 3D view tracking is implemented by detecting the gaze vector from the facial feature points and the head data center point extracted from the 3D map. Finally, when the gaze vector line intersects a specific point within the mapped space, the image coordinates corresponding to that point are extracted to implement user intent position estimation. Applying an object detection and classification algorithm to the extracted image can also estimate the intent object at that time.
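The abstract's intent-position step, casting a gaze ray from the head data center point through a facial feature point and intersecting it with the mapped space, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the head center, the feature point (assumed here to be the nose tip), and the wall plane are hypothetical values, and the mapped space is approximated by a single plane rather than a full 3D map.

```python
import numpy as np

def gaze_ray(head_center, face_feature):
    """Gaze direction as a unit vector from the head data center
    point through a facial feature point (e.g., the nose tip)."""
    origin = np.asarray(head_center, float)
    d = np.asarray(face_feature, float) - origin
    return origin, d / np.linalg.norm(d)

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Intersect the gaze ray with a plane of the mapped space.
    Returns the 3D hit point, or None if the ray is parallel to
    the plane or points away from it."""
    n = np.asarray(plane_normal, float)
    denom = direction @ n
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    t = ((np.asarray(plane_point, float) - origin) @ n) / denom
    return origin + t * direction if t > 0 else None

# Hypothetical scene: head at eye height in the room center,
# looking toward a wall 3 m away along the x-axis.
origin, d = gaze_ray([0.0, 0.0, 1.6], [0.1, 0.0, 1.6])
hit = intersect_plane(origin, d, [3.0, 0.0, 0.0], [1.0, 0.0, 0.0])
print(hit)  # → [3.  0.  1.6]
```

In the paper's pipeline, the resulting 3D hit point would then be mapped back to image coordinates of the relevant RGB-D camera so that object detection and classification can run on that image region.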
Date of Conference: 09-12 July 2018
Date Added to IEEE Xplore: 02 September 2018
ISBN Information:
Electronic ISSN: 2159-6255
Conference Location: Auckland, New Zealand
