Research Article
DOI: 10.1145/3472749.3474775

VEmotion: Using Driving Context for Indirect Emotion Prediction in Real-Time

Published: 12 October 2021

Editorial Notes

The authors have requested minor, non-substantive changes to the VoR and, in accordance with ACM policies, a Corrected VoR was published on October 14, 2021. For reference purposes the VoR may still be accessed via the Supplemental Material section on this page.

Abstract

Detecting emotions while driving remains a challenge in Human-Computer Interaction. Current methods to estimate the driver’s experienced emotions use physiological sensing (e.g., skin conductance, electroencephalography), speech, or facial expressions. However, these methods require drivers to wear sensing devices, perform explicit voice interaction, or exhibit robust facial expressiveness. We present VEmotion (Virtual Emotion Sensor), a novel method that predicts driver emotions unobtrusively from contextual smartphone data. VEmotion analyzes information including traffic dynamics, environmental factors, in-vehicle context, and road characteristics to implicitly classify driver emotions. We demonstrate its applicability in a real-world driving study (N = 12) that evaluates emotion prediction performance. Our results show that VEmotion outperforms facial expressions by 29% in a person-dependent classification and by 8.5% in a person-independent classification. We discuss how VEmotion enables empathic car interfaces that sense the driver’s emotions and provide in-situ interface adaptations on the go.
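The approach described in the abstract — classifying driver emotion from contextual signals rather than physiological or facial data — can be sketched, purely illustratively, as a supervised classifier over contextual features. The feature names, values, and label set below are invented for illustration and are not the paper's actual VEmotion feature set or pipeline:

```python
# Illustrative sketch only: a classifier over hypothetical driving-context
# features (speed, traffic density, weather code, road type). This is NOT the
# authors' VEmotion implementation; it shows the general shape of the idea.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic context samples: [speed_kmh, traffic_density, weather_code, road_type]
X = rng.random((200, 4))

# Synthetic per-sample emotion labels from a small, invented label set
emotions = np.array(["neutral", "happy", "angry"])
y = emotions[rng.integers(0, 3, size=200)]

# Fit a random-forest classifier on the contextual features
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Predict the emotion for a new context reading
pred = clf.predict(rng.random((1, 4)))
print(pred[0])  # one of the invented labels: "neutral", "happy", or "angry"
```

In a real system the feature vector would come from smartphone and map APIs (traffic, weather, road class) streamed in real time, and the labels from ground-truth self-reports collected during driving.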

Supplementary Material

VTT File (p638-video_preview.vtt)
3474775-vor (3474775-vor.pdf)
Version of Record for "VEmotion: Using Driving Context for Indirect Emotion Prediction in Real-Time" by Bethge et al., The 34th Annual ACM Symposium on User Interface Software and Technology (UIST '21).
MP4 File (p638-video_preview.mp4)
Video preview and captions




Published In

UIST '21: The 34th Annual ACM Symposium on User Interface Software and Technology
October 2021
1357 pages
ISBN:9781450386357
DOI:10.1145/3472749
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. contextual affective state prediction
  2. driver emotion detection
  3. machine learning
  4. mobile sensory system

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

UIST '21

Acceptance Rates

Overall Acceptance Rate 561 of 2,567 submissions, 22%



Cited By

  • ProxyLabel. Expert Systems with Applications: An International Journal 265:C (2025). https://doi.org/10.1016/j.eswa.2024.125913
  • Multimodal Dataset Construction and Validation for Driving-Related Anger: A Wearable Physiological Conduction and Vehicle Driving Data Approach. Electronics 13, 19 (2024), 3904. https://doi.org/10.3390/electronics13193904
  • AutoTherm: A Dataset and Benchmark for Thermal Comfort Estimation Indoors and in Vehicles. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8, 3 (2024), 1–49. https://doi.org/10.1145/3678503
  • Portobello: Extending Driving Simulation from the Lab to the Road. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–13. https://doi.org/10.1145/3613904.3642341
  • Review and Perspectives on Human Emotion for Connected Automated Vehicles. Automotive Innovation 7, 1 (2024), 4–44. https://doi.org/10.1007/s42154-023-00270-z
  • AI-enabled intelligent cockpit proactive affective interaction: middle-level feature fusion dual-branch deep learning network for driver emotion recognition. Advances in Manufacturing (2024). https://doi.org/10.1007/s40436-024-00519-8
  • Utilising Emotion Monitoring for Developing Music Interventions for People with Dementia: A State-of-the-Art Review. Sensors 23, 13 (2023), 5834. https://doi.org/10.3390/s23135834
  • Pilots' Considerations Regarding Current Generation Mixed Reality Headset Use in General Aviation Cockpits. In Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia, 159–165. https://doi.org/10.1145/3626705.3627785
  • Travel Experience in Public Transport: A Geospatial Analysis by Experience Mapping. In Proceedings of Mensch und Computer 2023, 417–421. https://doi.org/10.1145/3603555.3608537
  • A Demonstration of AutoVis: Enabling Mixed-Immersive Analysis of Automotive User Interface Interaction Studies. In Adjunct Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 279–282. https://doi.org/10.1145/3581961.3610374
