Abstract
This paper introduces a medium-scale point cloud dataset for semantic SLAM (Simultaneous Localization and Mapping) acquired with a SwissRanger time-of-flight camera. An indoor environment with relatively stable lighting conditions is considered for mapping and localization. The camera is mounted on a mobile tripod and captures images at prearranged locations in the environment. These prearranged locations serve both as ground truth for assessing the error of the poses estimated by SLAM and as initial pose estimates for the ICP (Iterative Closest Point) algorithm. Notably, no inertial measurement unit or visual odometry technique is used in this work, since data from time-of-flight cameras are noisy and sensitive to external conditions (such as lighting, transparent surfaces, parallel overlapping surfaces, etc.). Furthermore, a large collection of household object point clouds is acquired in order to label the scene with semantic information. The complete SLAM dataset with pose files, together with the point clouds of household objects, is the main contribution of this paper, alongside mapping and plane detection performed with a publicly available toolkit. In addition, a novel metric for evaluating SLAM algorithms, a context-based similarity score, is presented.
This work is supported by the Labex IMobS3.
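The prearranged tripod poses can seed pairwise scan registration directly. The following is a minimal sketch of that idea using the Open3D library rather than the toolkit used in the paper; the file names, the 0.05 m correspondence threshold, and the identity initial transform are placeholder assumptions, the initial guess normally being taken from the known tripod pose of each scan.

import numpy as np
import open3d as o3d

# Hypothetical file names: two consecutive ToF scans from the dataset.
source = o3d.io.read_point_cloud("scan_000.pcd")
target = o3d.io.read_point_cloud("scan_001.pcd")

# Initial guess for the rigid transform between the two scans.
# In the dataset this would come from the prearranged tripod poses;
# the identity matrix here is only a placeholder.
init_pose = np.eye(4)

# Point-to-point ICP refined from the initial pose; 0.05 m is an
# assumed maximum correspondence distance, to be tuned per scene.
result = o3d.pipelines.registration.registration_icp(
    source, target,
    max_correspondence_distance=0.05,
    init=init_pose,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

print("Estimated transform:\n", result.transformation)
print("Fitness:", result.fitness, "Inlier RMSE:", result.inlier_rmse)

Seeding ICP with the surveyed tripod poses in this way avoids the need for inertial or visual odometry, which is the design choice the abstract describes.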
Acknowledgements
This work is supported by the French government research program Investissements d'Avenir through the RobotEx Equipment of Excellence (ANR-10-EQPX-44) and the IMobS3 Laboratory of Excellence (ANR-10-LABX-16-01), by the European Union through the program Regional competitiveness and employment 2007–2013 (ERDF, Auvergne region), and by the Auvergne region.
Cite this paper
Ghorpade, V.K., Borrmann, D., Checchin, P., Malaterre, L., Trassoudaine, L. (2020). Time-of-Flight Depth Datasets for Indoor Semantic SLAM. In: Amato, N., Hager, G., Thomas, S., Torres-Torriti, M. (eds) Robotics Research. Springer Proceedings in Advanced Robotics, vol 10. Springer, Cham. https://doi.org/10.1007/978-3-030-28619-4_48