Perception for Everyday Human Robot Interaction

  • Technical Contribution
  • Published in KI - Künstliche Intelligenz

Abstract

The ability to build robotic agents that can perform everyday tasks depends heavily on understanding how humans perform them. To achieve a close-to-human understanding of a task and to generate a formal representation of it, it is important to reason jointly about the human's actions and the objects being acted on. We present a robotic perception framework for perceiving actions performed by a human in a household environment, which can be used to answer questions such as “which object did the human act on?” or “which actions did the human perform?”. To do this, we extend the RoboSherlock framework with the capability to detect humans and objects simultaneously while reasoning about the possible actions being performed.
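
As a concrete illustration of the joint human/object reasoning described above, the following minimal Python sketch answers “which object did the human act on?” by relating a detected hand to nearby detected objects. Every name and value in it (detect_humans, detect_objects, infer_acted_on, the reach threshold) is a hypothetical placeholder, not the RoboSherlock API.

    # Hypothetical sketch, not the RoboSherlock API: jointly annotate a frame
    # with humans and objects, then infer which object is being acted on.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str        # e.g. "human_hand", "cup"
        position: tuple   # (x, y, z) in the camera frame, metres

    def detect_humans(frame):
        # Placeholder; a real system would run skeleton/pose tracking here.
        return [Detection("human_hand", (0.42, 0.10, 0.85))]

    def detect_objects(frame):
        # Placeholder; a real system would run object-recognition annotators.
        return [Detection("cup", (0.45, 0.12, 0.83)),
                Detection("plate", (0.90, 0.40, 0.80))]

    def infer_acted_on(humans, objects, reach=0.10):
        # Crude stand-in for joint reasoning: a hand "acts on" the closest
        # detected object that lies within its reach.
        def dist(a, b):
            return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        for hand in humans:
            closest = min(objects, key=lambda o: dist(hand.position, o.position))
            if dist(hand.position, closest.position) < reach:
                yield hand, closest

    frame = None  # stand-in for an RGB-D frame from the robot's camera
    for hand, obj in infer_acted_on(detect_humans(frame), detect_objects(frame)):
        print(f"{hand.label} acts on {obj.label}")

In the actual system, the human and object detections would come from perception annotators running over the same camera frames, and the action inference would involve far richer reasoning than plain proximity.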





Acknowledgments

This work was supported in part by the EU FP7 projects RoboHow (Grant Agreement Number 288533), SAPHARI (Grant Agreement Number 287513), and ACAT (Grant Agreement Number 600578), and by the German Research Foundation (DFG) as part of the project MeMoMan2.

Author information

Correspondence to Jan-Hendrik Worch.


About this article


Cite this article

Worch, JH., Bálint-Benczédi, F. & Beetz, M. Perception for Everyday Human Robot Interaction. Künstl Intell 30, 21–27 (2016). https://doi.org/10.1007/s13218-015-0400-1


  • DOI: https://doi.org/10.1007/s13218-015-0400-1
