
Driver Stress Detection in Simulated Driving Scenarios with Photoplethysmography

  • Conference paper
  • First Online:
Distributed Computing and Artificial Intelligence, 19th International Conference (DCAI 2022)

Abstract

The great advances in technology have allowed the development of portable devices capable of monitoring different physiological measures in an inexpensive, non-invasive and efficient way. Virtual Reality (VR) has also evolved, achieving very realistic immersive experiences in different contexts. The combination of signal acquisition devices and VR makes it possible to generate useful knowledge even in challenging situations of daily life, such as driving. Processing these signals with artificial intelligence techniques enables the development of emotion recognition systems for monitoring human health and safety. The present work investigates the feasibility of detecting stress in individuals using physiological signals collected with a photoplethysmography (PPG) sensor incorporated into a commonly used wristwatch. The features acquired during an immersive VR driving simulation are used as input to a model trained with Machine Learning (ML) algorithms. This model performs driver stress detection and high-precision classification in real time. Through several immersive experiments, the proposed system is validated using Heart Rate Variability (HRV) analysis for the identification of driver stress.
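To make the described pipeline concrete, the following is a minimal sketch, not the authors' implementation, of how time-domain HRV features could be derived from PPG-based inter-beat intervals and fed to an ML classifier. The feature set, window size, synthetic labels and the random-forest choice are illustrative assumptions.

    # Minimal sketch, assuming PPG beats have already been converted to
    # inter-beat intervals (IBIs, in ms). Feature set, window size, labels,
    # and the RandomForest choice are illustrative assumptions, not the
    # authors' pipeline.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def hrv_features(ibi_ms):
        """Basic time-domain HRV features from one window of IBIs (ms)."""
        diffs = np.diff(ibi_ms)
        return np.array([
            ibi_ms.mean(),                 # mean inter-beat interval
            ibi_ms.std(ddof=1),            # SDNN
            np.sqrt(np.mean(diffs ** 2)),  # RMSSD
            60000.0 / ibi_ms.mean(),       # mean heart rate (bpm)
        ])

    # Synthetic training windows: 0 = relaxed (longer IBIs), 1 = stressed.
    rng = np.random.default_rng(0)
    X = np.vstack([hrv_features(rng.normal(m, 40, 120))
                   for m in (850, 700) for _ in range(50)])
    y = np.array([0] * 50 + [1] * 50)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    # Classify the latest window streamed from the wristband sensor.
    new_window = rng.normal(720, 35, 120)
    print("stressed" if clf.predict([hrv_features(new_window)])[0] else "relaxed")

In a real-time setting, the same feature extraction would run on a sliding window of inter-beat intervals streamed from the wristband before each prediction.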


Notes

  1. Leap Motion: https://www.ultraleap.com/product/leap-motion-controller/.
  2. Unreal Engine 4: https://www.unrealengine.com/en-US/.
  3. HTC Vive: https://www.vive.com/mx/product/vive-pro/.


Acknowledgments

This research has been supported by the project RTI2018-095390-B-C32 (MCIU/AEI/FEDER, UE).

Author information

Correspondence to Ana B. Gil-González.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Mateos-García, N., Gil-González, A.B., Reboredo, A.d.L., Pérez-Lancho, B. (2023). Driver Stress Detection in Simulated Driving Scenarios with Photoplethysmography. In: Omatu, S., Mehmood, R., Sitek, P., Cicerone, S., Rodríguez, S. (eds) Distributed Computing and Artificial Intelligence, 19th International Conference. DCAI 2022. Lecture Notes in Networks and Systems, vol 583. Springer, Cham. https://doi.org/10.1007/978-3-031-20859-1_29
