Abstract
Simulations are gaining increasing significance in the field of autonomous driving due to the demand for rapid prototyping and extensive testing. Physics-based simulation brings several benefits at an affordable cost while mitigating potential risks to prototypes, drivers, and vulnerable road users. However, it has two primary limitations. The first is the reality gap: the disparity between reality and simulation that prevents simulated autonomous driving systems from achieving the same performance in the real world. The second is the lack of empirical knowledge about the behavior of real agents, such as backup drivers or passengers, and of other road users such as vehicles, pedestrians, or cyclists. Agent simulation is commonly implemented through deterministic or randomized probabilistic pre-programmed models, or generated from real-world data, but it fails to accurately represent the behaviors that real agents adopt while interacting within a specific simulated scenario. This paper extends the description of our proposed framework for real-time interaction between real agents and simulated environments, by means of immersive virtual reality and human motion capture systems within the CARLA simulator for autonomous driving. We have designed a set of usability examples that allow the analysis of the interactions between real pedestrians and simulated autonomous vehicles, and we provide a first measure of the user's sensation of presence in the virtual environment.
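The paper itself does not include implementation code, but the kind of integration the abstract describes can be sketched with CARLA's standard Python API. In the minimal example below, a simulated pedestrian (walker) is spawned and its root transform is updated every frame from an external tracking source. The read_tracked_pose() function is only a placeholder for the VR headset / motion capture stream; it is not part of CARLA or of the authors' framework, which additionally maps full-body mocap data onto the avatar's skeleton rather than just teleporting its root.

```python
import time
import carla


def read_tracked_pose():
    """Placeholder for the VR/mocap data source (not part of CARLA).

    A real integration would query the tracking SDK here and return
    (x, y, z, yaw) expressed in CARLA map coordinates.
    """
    raise NotImplementedError


def main():
    # Connect to a locally running CARLA server (default port).
    client = carla.Client('localhost', 2000)
    client.set_timeout(5.0)
    world = client.get_world()

    # Spawn a pedestrian (walker) that will mirror the real user's pose.
    walker_bp = world.get_blueprint_library().filter('walker.pedestrian.*')[0]
    spawn = carla.Transform(carla.Location(x=0.0, y=0.0, z=1.0))
    walker = world.try_spawn_actor(walker_bp, spawn)
    if walker is None:
        raise RuntimeError('Could not spawn the walker at the chosen location')

    try:
        while True:
            x, y, z, yaw = read_tracked_pose()
            # Move the avatar's root to the tracked pose each iteration.
            walker.set_transform(carla.Transform(
                carla.Location(x=x, y=y, z=z),
                carla.Rotation(yaw=yaw)))
            time.sleep(0.01)  # ~100 Hz update; tune to the tracker rate
    finally:
        walker.destroy()


if __name__ == '__main__':
    main()
```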
Acknowledgements
This work was funded by Research Grants PID2020-114924RB-I00 and PDC2021-121324-I00 (Spanish Ministry of Science and Innovation) and partially by S2018/EMT-4362 SEGVAUTO 4.0-CM (Community of Madrid). D. Fernández Llorca acknowledges funding from the HUMAINT project by the Directorate-General Joint Research Centre of the European Commission.
Ethics declarations
Disclaimer
The views expressed in this article are purely those of the authors and may not, under any circumstances, be regarded as an official position of the European Commission.
Appendix A
1.1 Self-presence Scale Items
To what extent did you feel that… (1 = not at all, 5 = very strongly)
1. You could move the avatar's hands.
2. The avatar's displacement was your own displacement.
3. The avatar's body was your own body.
4. If something happened to the avatar, it was happening to you.
5. The avatar was you.
1.2 Autonomous Vehicle Presence Scale Items
To what extent did you feel that… (1 = not at all, 5 = very strongly)
1. The vehicle was present.
2. The vehicle dynamics and its movement were natural.
3. The sound of the vehicle helped you to locate it.
4. The vehicle was aware of your presence.
5. The vehicle was real.
1.3 Environmental Presence Scale Items
To what extent did you feel that… (1 = not at all, 5 = very strongly)
1. You were really in front of a pedestrian crossing.
2. The road signs and traffic lights were real.
3. You really crossed the pedestrian crossing.
4. The urban environment seemed like the real world.
5. You could reach out and touch the objects in the urban environment.
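The paper derives its first measure of presence from these three five-item scales, but the scoring procedure is not detailed here. The snippet below only illustrates one common convention, the per-subscale mean of the 1-5 ratings, using made-up responses for a single hypothetical participant.

```python
from statistics import mean

# Hypothetical 1-5 Likert responses from one participant (illustration only).
responses = {
    'self_presence': [4, 5, 4, 3, 4],
    'autonomous_vehicle_presence': [5, 4, 4, 3, 4],
    'environmental_presence': [4, 4, 5, 4, 3],
}

# Report the mean item score per subscale (range 1-5).
scores = {scale: mean(items) for scale, items in responses.items()}
for scale, score in scores.items():
    print(f'{scale}: {score:.2f} / 5')
```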
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Serrano, S.M., Llorca, D.F., Daza, I.G., Sotelo, M.Á. (2023). Realistic Pedestrian Behaviour in the CARLA Simulator Using VR and Mocap. In: Holzinger, A., da Silva, H.P., Vanderdonckt, J., Constantine, L. (eds) Computer-Human Interaction Research and Applications. CHIRA 2021, CHIRA 2022. Communications in Computer and Information Science, vol 1882. Springer, Cham. https://doi.org/10.1007/978-3-031-41962-1_5
DOI: https://doi.org/10.1007/978-3-031-41962-1_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-41961-4
Online ISBN: 978-3-031-41962-1
eBook Packages: Computer Science (R0)