Reliable Workspace Monitoring in Safe Human-Robot Environment

  • Conference paper
International Joint Conference SOCO’16-CISIS’16-ICEUTE’16 (SOCO 2016, CISIS 2016, ICEUTE 2016)

Abstract

The implementation of a reliable vision system with full perception of the human-robot environment is a key issue for flexible collaborative production, especially in frequently changing applications. Such a system facilitates the perception and recognition of human activity and thereby substantially increases the robustness and reactivity of safety strategies in collaborative tasks. This paper presents an implementation of several techniques for workspace monitoring in collaborative human-robot applications. A reliable perception of the overall environment produces a consistent point cloud, which is used for human detection and tracking. In addition, safety strategies on the robotic system (reduced velocity, emergency stop, ...) are activated when the human-robot distance approaches predefined security thresholds.
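The distance-based safety behavior described above can be sketched as a simple policy that maps the minimum human-robot distance to a robot mode. This is a hypothetical illustration, not the paper's implementation: the threshold values, mode names, and function names are assumptions introduced here.

```python
import math

# Illustrative thresholds (in meters); the paper's actual security
# thresholds are predefined per application and are not given here.
STOP_DIST = 0.5   # below this: emergency stop
SLOW_DIST = 1.5   # below this: reduced velocity

def min_distance(robot_pos, human_points):
    """Minimum Euclidean distance from the robot position to any
    tracked human point (e.g. from the fused point cloud)."""
    return min(math.dist(robot_pos, p) for p in human_points)

def safety_mode(robot_pos, human_points):
    """Select a safety strategy from the current human-robot distance."""
    if not human_points:
        return "normal"
    d = min_distance(robot_pos, human_points)
    if d < STOP_DIST:
        return "emergency_stop"
    if d < SLOW_DIST:
        return "reduced_velocity"
    return "normal"
```

In a real system this check would run at the perception loop rate against the tracked human cluster, with hysteresis on the thresholds to avoid mode chattering near a boundary.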

Notes

  1. An implementation of the presented techniques in the scope of the LIAA project is available at https://youtu.be/AtZGeX2t51k.

Acknowledgments

The research leading to these results has been funded in part by the European Union’s seventh framework program (FP7/2007-2013) under grant agreements #608604 (LIAA: Lean Intelligent Assembly Automation).

Author information

Corresponding author

Correspondence to Héctor Herrero.

Rights and permissions

Reprints and permissions
Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Moughlbay, A.A., Herrero, H., Pacheco, R., Outón, J.L., Sallé, D. (2017). Reliable Workspace Monitoring in Safe Human-Robot Environment. In: Graña, M., López-Guede, J.M., Etxaniz, O., Herrero, Á., Quintián, H., Corchado, E. (eds) International Joint Conference SOCO’16-CISIS’16-ICEUTE’16. SOCO/CISIS/ICEUTE 2016. Advances in Intelligent Systems and Computing, vol 527. Springer, Cham. https://doi.org/10.1007/978-3-319-47364-2_25

  • DOI: https://doi.org/10.1007/978-3-319-47364-2_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47363-5

  • Online ISBN: 978-3-319-47364-2

  • eBook Packages: Engineering (R0)
