Wide Field of View Kinect Undistortion for Social Navigation Implementation

  • Conference paper
Advances in Visual Computing (ISVC 2012)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 7432)

Included in the following conference series: International Symposium on Visual Computing (ISVC)

Abstract

In planning navigation schemes for social robots, distinguishing between humans and other obstacles is crucial for obtaining safe and comfortable motion. A Kinect camera is capable of fulfilling such a task but unfortunately delivers only a limited field of view (FOV). Recently, a lens that widens the Kinect's FOV has become commercially available from Nyko. However, this lens distorts the RGB-D data, including the depth values. To address this issue, we propose a two-stage undistortion strategy. First, pixel locations in both the RGB and depth images are corrected using an inverse radial distortion model. Next, the depth data is post-filtered using 3D point cloud analysis to diminish the noise introduced by the undistortion process and to remove the ground/ceiling information. Finally, the depth values are rectified using a neural network filter based on laser-assisted training. Experimental results demonstrate the feasibility of the proposed approach for correcting distorted RGB-D data.
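
The paper's calibration coefficients and training data are not reproduced on this page, so the following is only a minimal sketch of the first stage: one common way to realize an inverse radial distortion mapping is to invert a two-coefficient forward model by fixed-point iteration. All parameter values (k1, k2, fx, fy, cx, cy) and the function name are hypothetical, not taken from the paper.

```python
import numpy as np

def undistort_points(points_px, k1, k2, fx, fy, cx, cy, iters=10):
    """Invert the radial distortion model x_d = x_u * (1 + k1*r^2 + k2*r^4).

    points_px : (N, 2) array of distorted pixel coordinates
    k1, k2    : radial distortion coefficients (hypothetical values for the wide-FOV lens)
    fx, fy    : focal lengths in pixels; cx, cy : principal point
    """
    # Normalize distorted pixels to camera coordinates.
    xd = (points_px[:, 0] - cx) / fx
    yd = (points_px[:, 1] - cy) / fy
    x, y = xd.copy(), yd.copy()
    # Fixed-point iteration: repeatedly divide out the distortion factor
    # evaluated at the current undistorted estimate.
    for _ in range(iters):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x = xd / factor
        y = yd / factor
    # Map the undistorted normalized coordinates back to pixels.
    out = np.empty((points_px.shape[0], 2), dtype=float)
    out[:, 0] = x * fx + cx
    out[:, 1] = y * fy + cy
    return out
```

The same correction can be applied to both the RGB and depth images, since the abstract states that pixel locations in both are undistorted before the point-cloud post-filtering and neural-network depth rectification stages.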

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tomari, R., Kobayashi, Y., Kuno, Y. (2012). Wide Field of View Kinect Undistortion for Social Navigation Implementation. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2012. Lecture Notes in Computer Science, vol 7432. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33191-6_52

  • DOI: https://doi.org/10.1007/978-3-642-33191-6_52

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33190-9

  • Online ISBN: 978-3-642-33191-6

  • eBook Packages: Computer Science (R0)
