Abstract
Application scenarios for unmanned delivery vehicles continue to multiply, and traditional human-computer interaction methods can no longer meet the demands of different task scenarios and users. The primary purpose of this paper is to apply natural human-computer interaction technology to the field of unmanned delivery, replacing the traditional mode in which users can operate unmanned delivery vehicles only through a touch screen. The task scenarios of unmanned delivery vehicles are classified using the concept of context, and participatory design and heuristic research are used to let users define interactive gestures. Two gesture interaction sets are designed that suit different task scenarios and can be accepted and understood by general users. Based on Kinect's depth imaging and skeleton tracking technology, a large number of preset gesture samples are collected, and the AdaBoost algorithm is used to train a gesture recognizer. Recognition tests show that the gesture recognition achieved by this method has a high recognition rate and fast response speed. Finally, the task scenarios of a real unmanned delivery vehicle are simulated in a virtual scene built with Unity3D. A usability test of the human-computer interaction system shows that this interaction mode maintains task efficiency to a certain extent while improving the user experience.
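To make the recognition pipeline described above concrete, the sketch below shows how flattened Kinect skeleton-joint coordinates could be fed to an AdaBoost classifier. This is a minimal illustration, not the authors' implementation: scikit-learn's AdaBoostClassifier stands in for the training step, and the joint layout, gesture labels, and synthetic data are assumptions.

```python
# Minimal sketch (not the paper's code): AdaBoost on Kinect skeleton features.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

N_JOINTS = 25               # Kinect v2 tracks 25 skeleton joints
N_FEATURES = N_JOINTS * 3   # (x, y, z) per joint, flattened into one vector
GESTURES = ["summon", "confirm", "cancel", "open_compartment"]  # hypothetical labels

# Placeholder for real preset gesture samples captured via skeleton tracking.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, N_FEATURES))        # one row = one gesture sample
y = rng.integers(0, len(GESTURES), size=2000)  # gesture class index per sample

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# AdaBoost combines many weak learners (decision stumps by default)
# into a strong classifier by reweighting misclassified samples.
clf = AdaBoostClassifier(n_estimators=200, learning_rate=0.5, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
```

In a real system the synthetic matrix `X` would be replaced by the collected gesture samples, and the trained classifier would be queried frame by frame to trigger vehicle actions.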