Abstract
Hand-held scanners are increasingly being adopted in construction-site workflows. However, they suffer from accuracy problems that prevent their deployment in demanding use cases. In this paper, we present a real-world dataset collected periodically on a construction site to measure the accuracy of the SLAM algorithms that mobile scanners rely on. The dataset contains time-synchronised and spatially registered images, LiDAR scans, inertial data, and professional ground-truth scans. To the best of our knowledge, this is the first publicly available dataset that reflects the need to scan construction sites periodically for accurate progress monitoring with a hand-held scanner.
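As a hedged illustration of the kind of accuracy evaluation the abstract describes, the sketch below computes cloud-to-cloud (nearest-neighbour) distances between a SLAM-reconstructed point cloud and a ground-truth scan. The function name, the summary statistics, and the toy data are illustrative assumptions, not the paper's actual evaluation protocol.

```python
# Sketch: evaluating a SLAM-reconstructed point cloud against a ground-truth
# scan via nearest-neighbour (cloud-to-cloud) distances. Names and thresholds
# are hypothetical; they do not come from the ConSLAM paper itself.
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_error(slam_cloud: np.ndarray, gt_cloud: np.ndarray):
    """Return per-point distances from each SLAM point to its nearest
    ground-truth point, plus simple summary statistics."""
    tree = cKDTree(gt_cloud)            # spatial index over the ground truth
    dists, _ = tree.query(slam_cloud)   # nearest-neighbour distance per point
    return dists, {
        "mean": float(dists.mean()),
        "rmse": float(np.sqrt((dists ** 2).mean())),
        "p95": float(np.percentile(dists, 95)),
    }

# Toy usage: random points stand in for real scans, with ~1 cm simulated noise
rng = np.random.default_rng(0)
gt = rng.uniform(0.0, 10.0, size=(1000, 3))
slam = gt + rng.normal(0.0, 0.01, size=gt.shape)
dists, stats = cloud_to_cloud_error(slam, gt)
print(stats)  # prints the summary statistics dict
```

In practice the two clouds would first be registered (e.g. by ICP) before measuring residual distances; the sketch assumes they already share a coordinate frame.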
Notes
- 2. Please consult www.hesaitech.com.
- 11. We used the distance-based downsampling implemented in CloudCompare 2.12.2. See https://www.cloudcompare.org.
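The distance-based downsampling mentioned in the notes can be sketched as a greedy minimum-spacing filter. This is an independent re-implementation of the general idea, assuming behaviour similar to CloudCompare's "space" subsampling; it is not CloudCompare's code, and the point data and spacing value are illustrative.

```python
# Sketch: greedy distance-based downsampling. Keep points so that no two
# retained points lie closer together than min_dist.
import numpy as np
from scipy.spatial import cKDTree

def downsample_min_distance(points: np.ndarray, min_dist: float) -> np.ndarray:
    """Greedily keep points such that every pair of kept points is
    separated by more than min_dist."""
    tree = cKDTree(points)
    keep = np.ones(len(points), dtype=bool)
    for i in range(len(points)):
        if not keep[i]:
            continue
        # discard every later point within min_dist of the kept point i
        for j in tree.query_ball_point(points[i], min_dist):
            if j > i:
                keep[j] = False
    return points[keep]

# Toy usage: a 10x10 unit grid (2-D for illustration) thinned to 1.5 spacing
grid = np.stack(np.meshgrid(np.arange(10.0), np.arange(10.0)), -1).reshape(-1, 2)
sparse = downsample_min_distance(grid, 1.5)
```

A voxel-grid filter would give a similar density reduction; the distance-based variant guarantees a hard minimum spacing instead of one point per voxel.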
Acknowledgment
The authors would like to thank Laing O’Rourke for granting access to their construction site and for collecting the ground-truth scans. We also thank Romain Carriquiry-Borchiari of Ubisoft France for his help with rendering some of the figures. This work is supported by the EU Horizon 2020 BIM2TWIN: Optimal Construction Management & Production Control project under grant agreement No. 958398. The first author would also like to thank BP, GeoSLAM, Laing O’Rourke, Topcon and Trimble for sponsoring his studentship funding.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Trzeciak, M. et al. (2023). ConSLAM: Periodically Collected Real-World Construction Dataset for SLAM and Progress Monitoring. In: Karlinsky, L., Michaeli, T., Nishino, K. (eds) Computer Vision – ECCV 2022 Workshops. ECCV 2022. Lecture Notes in Computer Science, vol 13807. Springer, Cham. https://doi.org/10.1007/978-3-031-25082-8_21
Print ISBN: 978-3-031-25081-1
Online ISBN: 978-3-031-25082-8