EmotiphAI: a biocybernetic engine for real-time biosignals acquisition in a collective setting

  • S.I.: Computational-based Biomarkers for Mental and Emotional Health (CBMEH2021)
  • Published in: Neural Computing and Applications

Abstract

In recent years, wearable technology for personal use has grown considerably. However, an analysis of the state of the art shows that most wearable devices acquire data from individual subjects only, relying on communication technologies whose drawbacks prevent their use in collective real-world scenarios (e.g. a cinema, a theatre, and related settings). When analysing the emotional response of groups, two types of emotions appear: individual emotions (influenced by the group) and group-based emotions (directed towards the group as an identity). To fill this gap, we propose a biocybernetic engine for real-time acquisition of multimodal physiological data in real-world scenarios. Our system extends the state of the art with: (1) real-time data acquisition for the signals being acquired (20 devices at 25 Hz; 10 devices at 60 Hz); (2) a standalone local infrastructure with an end-user interface for monitoring the data acquisition; (3) local and cloud-based data storage. We foresee that this platform could be the basis for the creation of large databases in diverse real-world scenarios, namely health and wellbeing, marketing, art performances, and others. As a result, this work will greatly contribute to simplifying widespread biosignal data collection from unobtrusive wearables. To evaluate the system, we report a comprehensive assessment based on a set of criteria for data quality analysis.
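The abstract describes a hub that streams biosignals from many wearables at heterogeneous rates (20 devices at 25 Hz; 10 devices at 60 Hz). The paper's actual implementation is not reproduced here; as a minimal illustrative sketch of per-device buffering for such a collective acquisition setting (all class and attribute names are hypothetical), one could write:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Sample:
    device_id: str    # identifier of the wearable
    timestamp: float  # seconds since acquisition start
    value: float      # raw sensor reading

class AcquisitionHub:
    """Collects samples from many wearables into bounded per-device buffers."""

    def __init__(self, buffer_size: int = 1000):
        self.buffer_size = buffer_size
        self.buffers = defaultdict(list)

    def push(self, sample: Sample) -> None:
        buf = self.buffers[sample.device_id]
        buf.append(sample)
        if len(buf) > self.buffer_size:  # drop the oldest sample to bound memory
            del buf[0]

    def latest(self, device_id: str, n: int = 1) -> list:
        return self.buffers[device_id][-n:]

# Simulate one second of acquisition: 20 devices at 25 Hz and 10 at 60 Hz.
hub = AcquisitionHub()
for dev in range(20):
    for k in range(25):
        hub.push(Sample(f"eda{dev}", k / 25.0, 0.0))
for dev in range(10):
    for k in range(60):
        hub.push(Sample(f"imu{dev}", k / 60.0, 0.0))

print(len(hub.buffers))             # 30 devices seen
print(len(hub.latest("imu0", 60)))  # 60 samples from the last second
```

A real deployment would feed `push` from the radio link (the paper's system streams over WiFi) and expose the buffers to the monitoring interface; the sketch only shows the multi-rate bookkeeping.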






Acknowledgements

This work was partially funded by Fundação para a Ciência e Tecnologia (FCT) under the grant 2020.06675.BD and under the projects PCIF/SSO/0163/2019 “SafeFire” and DSAIPA/AI/0122/2020 “AIM Health”, by the FCT/Ministério da Ciência, Tecnologia e Ensino Superior (MCTES) through national funds and when applicable co-funded by EU funds under the project UIDB/50008/2020, and by the IT - Instituto de Telecomunicações.

Author information

Corresponding author

Correspondence to Patrícia Bota.

Ethics declarations

Conflict of interest

The authors have no competing interests relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A

See Tables 2, 3, 4 and 5.

Table 2 Technical specifications for the Zigbee, Bluetooth and WiFi protocols
Table 3 Router specifications
Table 4 Metrics and brief description of the information extracted from the data for the EmotiphAI data acquisition validation tests
Table 5 Results for EmotiphAI R-IoT validation tests under different configurations
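Table 4 lists the metrics extracted from the data for the EmotiphAI validation tests; their exact definitions are in the appendix tables. As a hypothetical illustration of one common data-quality criterion for streamed biosignals (not necessarily the paper's formulation), a sample-loss ratio can be estimated from the received timestamps and the nominal sampling rate:

```python
def sample_loss_ratio(timestamps, fs):
    """Fraction of expected samples missing, given nominal sampling rate fs (Hz)."""
    if len(timestamps) < 2:
        return 0.0
    duration = timestamps[-1] - timestamps[0]
    expected = round(duration * fs) + 1  # samples expected over the span, endpoints inclusive
    return max(0.0, 1 - len(timestamps) / expected)

# A 25 Hz stream over one second, with 5 of the 26 expected samples dropped.
ts = [k / 25.0 for k in range(26) if k not in {3, 7, 8, 14, 20}]
print(round(sample_loss_ratio(ts, 25), 4))  # 0.1923
```

This assumes timestamps are assigned on a common clock; in practice, clock drift between devices would also have to be accounted for.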


About this article

Cite this article

Bota, P., Flety, E., Silva, H.P.d. et al. EmotiphAI: a biocybernetic engine for real-time biosignals acquisition in a collective setting. Neural Comput & Applic 35, 5721–5736 (2023). https://doi.org/10.1007/s00521-022-07191-8
