Multispectral Stereoscopic Robotic Head Calibration and Evaluation

  • Conference paper
Modelling and Simulation for Autonomous Systems (MESAS 2015)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9055)

Abstract

The aim of the paper is to describe data fusion from optical sensors for mobile-robot reconnaissance and mapping. The data are acquired by a stereo pair of CCD cameras, a stereo pair of thermal imagers, and a TOF (time-of-flight) range camera.

The described calibration and data-fusion algorithms may be used for two purposes: visual telepresence (remote control) under an extremely wide variety of visual conditions, such as fog, smoke, or darkness, and autonomous multispectral digital mapping of the robot's environment.

The fusion is realized by means of spatial data from the TOF camera: the thermal and CCD camera data are combined into a single multispectral 3D model for mapping purposes, or into a stereo image presented on a binocular head-mounted display. Data acquisition is performed with a sensor head containing the five cameras mentioned above, mounted on a 3-degrees-of-freedom (DOF) manipulator on the Orpheus-X3 reconnaissance robot; both the head and the robot were developed by our working group.
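
A minimal sketch of this fusion step, assuming already-calibrated pinhole models for the CCD and thermal cameras (this is not the authors' implementation; all matrix names and the nearest-pixel sampling are illustrative assumptions): each 3D point measured by the TOF camera is projected into both images and annotated with the sampled color and temperature.

```python
import numpy as np

def project(points_xyz, K, R, t):
    """Project Nx3 points given in the TOF frame into a camera with
    intrinsics K and pose (R, t) relative to the TOF camera."""
    cam = R @ points_xyz.T + t.reshape(3, 1)   # (3, N) points in the camera frame
    uv = K @ cam                                # homogeneous pixel coordinates
    return (uv[:2] / uv[2]).T                   # (N, 2) pixel coordinates

def fuse(points_xyz, rgb_img, K_rgb, R_rgb, t_rgb, thermal_img, K_th, R_th, t_th):
    """Attach a color and a temperature value to every TOF point that
    falls inside both images (nearest-pixel sampling)."""
    uv_rgb = project(points_xyz, K_rgb, R_rgb, t_rgb)
    uv_th = project(points_xyz, K_th, R_th, t_th)
    h, w = rgb_img.shape[:2]
    ht, wt = thermal_img.shape[:2]
    fused = []
    for p, (u, v), (ut, vt) in zip(points_xyz, uv_rgb, uv_th):
        # A full implementation would also cull points behind either camera
        # and handle occlusions; only the in-image check is shown here.
        if 0 <= v < h and 0 <= u < w and 0 <= vt < ht and 0 <= ut < wt:
            fused.append((p, rgb_img[int(v), int(u)], thermal_img[int(vt), int(ut)]))
    return fused
```

The same per-point correspondences can drive either output: reprojected into the left and right CCD/thermal image pairs they yield the aligned stereo overlay for the head-mounted display, and attached to the 3D points they yield the multispectral point cloud used for mapping.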

Although the fusion is used for two different tasks, automatic environment mapping and visual telepresence, the underlying calibration and fusion algorithms are, in principle, the same.

Both the geometrical calibration of each sensor and the mutual positions of the sensors in 6 DOF are calculated from calibration data acquired with a newly developed multispectral calibration pattern. For the fusion, the corresponding data from the CCD cameras and the thermal imagers are determined via homogeneous and perspective transformations. The result is either an image containing aligned data from the CCD camera and the thermal imager for each eye, or a set of 3D points augmented with color and thermal information. The precision of the data fusion, and hence of the calibration, is determined both by calculation from a mathematical model and by experimental evaluation in a real environment using newly developed multispectral targets.
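
As an illustration of the perspective and homogeneous transformations mentioned above (the notation is generic and not taken from the paper), a point (X, Y, Z) measured in the TOF camera frame is mapped into the image of a CCD camera or thermal imager whose 6-DOF pose (R, t) relative to the TOF camera and intrinsic matrix K are known from the calibration:

```latex
% Generic pinhole reprojection; K, R, t, s are illustrative symbols, not the paper's notation.
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
  = K \, [\, R \mid t \,]
    \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
\qquad
K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}.
```

The resulting pixel (u, v) is where the color or thermal value associated with the 3D point is sampled.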

Acknowledgement

This work was supported by the grant VG 2012 2015 096, Cooperative Robotic Exploration of Dangerous Areas, of the Ministry of the Interior of the Czech Republic, program BV II/2-VS.

This work was supported by the project CEITEC - Central European Institute of Technology (CZ.1.05/1.1.00/02.0068) from the European Regional Development Fund.

Author information

Corresponding author

Correspondence to Ludek Zalud.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Kocmanova, P., Zalud, L. (2015). Multispectral Stereoscopic Robotic Head Calibration and Evaluation. In: Hodicky, J. (eds) Modelling and Simulation for Autonomous Systems. MESAS 2015. Lecture Notes in Computer Science, vol. 9055. Springer, Cham. https://doi.org/10.1007/978-3-319-22383-4_13

  • DOI: https://doi.org/10.1007/978-3-319-22383-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-22382-7

  • Online ISBN: 978-3-319-22383-4

  • eBook Packages: Computer Science, Computer Science (R0)
