Abstract
In recent years, wearable technology for personal use has grown considerably. However, an analysis of the state of the art shows that most devices acquire data from individual subjects only, relying on communication technologies whose drawbacks prevent their use in collective real-world scenarios (e.g. a cinema, a theatre, and related settings). When analysing the emotional response of groups, two types of emotions arise: individual emotions (influenced by the group) and group-based emotions (directed towards the group as an identity). To fill this gap, we propose a biocybernetic engine for real-time acquisition of multimodal physiological data in real-world scenarios. Our system extends the state of the art with: (1) real-time data acquisition for the signals of interest (20 devices at 25 Hz; 10 devices at 60 Hz); (2) a standalone local infrastructure with an end-user interface for monitoring the data acquisition; and (3) local and cloud-based data storage. We foresee that this platform could serve as the basis for the creation of large databases in diverse real-world scenarios, namely health and wellbeing, marketing, art performances, and others. As a result, this work contributes to simplifying widespread biosignal data collection from unobtrusive wearables. To evaluate the system, we report a comprehensive assessment based on a set of data quality criteria.
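For illustration only, and not the authors' implementation: the acquisition topology described in the abstract, 20 devices streaming at 25 Hz and 10 devices at 60 Hz into a shared local store, can be sketched as a set of per-device acquisition threads. All names here (`acquire`, `run_session`, the `dev25_*`/`dev60_*` identifiers) and the simulated readings are assumptions made for this sketch.

```python
import threading

def acquire(device_id, rate_hz, duration_s, store):
    """Simulated acquisition loop: one reading per sampling period.

    A real deployment would read from the wearable's radio link here;
    this sketch just records (timestamp, placeholder_value) pairs.
    """
    period = 1.0 / rate_hz
    n_samples = int(rate_hz * duration_s)
    for i in range(n_samples):
        store[device_id].append((i * period, 0.0))

def run_session(devices, duration_s=1.0):
    """Run one acquisition thread per device and collect all samples locally."""
    store = {device_id: [] for device_id in devices}  # preallocated, one list per device
    threads = [
        threading.Thread(target=acquire, args=(d, rate, duration_s, store))
        for d, rate in devices.items()
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return store

# 20 devices at 25 Hz plus 10 devices at 60 Hz, as in the abstract.
devices = {f"dev25_{i}": 25 for i in range(20)} | {f"dev60_{i}": 60 for i in range(10)}
data = run_session(devices, duration_s=1.0)
```

After a 1 s session, each 25 Hz device holds 25 samples and each 60 Hz device holds 60; the local store could then be flushed to disk or to a cloud backend, as point (3) of the abstract describes.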









Acknowledgements
This work was partially funded by Fundação para a Ciência e Tecnologia (FCT) under the grant 2020.06675.BD and under the projects PCIF/SSO/0163/2019 “SafeFire” and DSAIPA/AI/0122/2020 “AIM Health”, by the FCT/Ministério da Ciência, Tecnologia e Ensino Superior (MCTES) through national funds and when applicable co-funded by EU funds under the project UIDB/50008/2020, and by the IT - Instituto de Telecomunicações.
Ethics declarations
Conflict of interest
The authors have no competing interests to declare that are relevant to the content of this article.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A
Rights and permissions
About this article
Cite this article
Bota, P., Flety, E., Silva, H.P.d. et al. EmotiphAI: a biocybernetic engine for real-time biosignals acquisition in a collective setting. Neural Comput & Applic 35, 5721–5736 (2023). https://doi.org/10.1007/s00521-022-07191-8