
Kinect Controlled UGV


Abstract

Extensive research has been conducted on techniques for navigating automobiles and robots using diverse motion-sensing input devices. This paper presents an approach that uses the Kinect for navigation, with options for manual control, autonomous control, or a combination of both. A rover whose navigation is controlled through human gestures, voice, and a self-adaptive mode is presented, with its prime application in the automotive industry. A prototype has been developed and tested. The main objective is to facilitate users by introducing a gesture- and voice-controlled automobile model. The Kinect optical sensor allows users to experience intuitive, robust, and engaging human-robot interaction. It can control a wide range of operations such as acceleration, braking, steering, media, and volume from gestures or voice alone. The rover can be used in homes, industries, hospitals, restaurants, and similar environments. The system is intelligent and adapts to avoid obstacles in its path. Gesture and voice control can also help people with disabilities accomplish tasks such as driving. The rover can additionally operate in an auto mode that follows waypoints supplied through Google/Bing Maps, effectively turning it into a self-driving car.
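As a rough illustration of the gesture-control idea described above, the sketch below maps Kinect skeleton joint positions to throttle and steering commands for the rover. The paper does not publish its code; the joint names, thresholds, and command format here are assumptions made only for illustration.

```python
# Hypothetical sketch: mapping Kinect skeleton joints to rover drive commands.
# Joint coordinates are assumed to be metres in the Kinect camera frame
# (x right, y up, z away from the sensor); names and thresholds are illustrative.

from dataclasses import dataclass


@dataclass
class Joint:
    x: float
    y: float
    z: float


def gesture_to_command(left_hand: Joint, right_hand: Joint,
                       shoulder_center: Joint) -> dict:
    """Translate two hand positions into throttle and steering values.

    Raising the right hand above the shoulder accelerates, raising the
    left hand brakes, and the horizontal offset of the hands relative to
    the body centre steers.
    """
    throttle = 0.0
    if right_hand.y > shoulder_center.y:      # right hand raised -> accelerate
        throttle = min(1.0, (right_hand.y - shoulder_center.y) / 0.3)
    if left_hand.y > shoulder_center.y:       # left hand raised -> brake
        throttle = 0.0

    # Steering: mean hand offset from the shoulder centre, clamped to [-1, 1].
    steering = (right_hand.x + left_hand.x) / 2 - shoulder_center.x
    steering = max(-1.0, min(1.0, steering))

    return {"throttle": round(throttle, 2), "steering": round(steering, 2)}


if __name__ == "__main__":
    # Example frame: right hand raised slightly and shifted to the right.
    cmd = gesture_to_command(Joint(-0.3, 0.9, 2.0),
                             Joint(0.4, 1.35, 2.0),
                             Joint(0.0, 1.2, 2.0))
    print(cmd)  # {'throttle': 0.5, 'steering': 0.05}
```

In a real system the command dictionary would be serialized and sent to the rover's motor controller, and the same pattern could be extended to the voice and auto (waypoint-following) modes mentioned in the abstract.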




Author information

Correspondence to Samreen Amir.


About this article


Cite this article

Amir, S., Waqar, A., Siddiqui, M.A. et al. Kinect Controlled UGV. Wireless Pers Commun 95, 631–640 (2017). https://doi.org/10.1007/s11277-016-3915-3


Keywords

Navigation