
Toward an Interoperable Catalogue of Multimodal Depression-Related Data

  • Conference paper
Applied Intelligence and Informatics (AII 2022)

Abstract

Building an intelligent tool for the semi-automatic diagnosis of depression from high-quality data requires trustworthy, interoperable, and multimodal data repositories. Such databases should be based on common collection and storage criteria and should enable advanced, open analysis without sacrificing data privacy. This paper launches the Depressive Disorder DataBase (D3B) initiative, which defines a distributed and interoperable network of databases for the collection and actual use of depression-related multimodal data. Within this network, multimedia such as voice, video, handwriting, and EEG signals can be collected and shared with a broad community of researchers. The paper focuses on the catalogue schema, on technical-level details, and on the mechanisms that guarantee interoperability and privacy at the same time.
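As a purely illustrative sketch (the concrete D3B catalogue schema is defined in the paper itself; every field name and value below is an assumption), a catalogue entry describing a multimodal, privacy-aware dataset could be modelled along these lines:

```python
# Hypothetical sketch of a catalogue entry for a multimodal depression dataset.
# This is NOT the schema from the paper; all fields are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Modality(Enum):
    VOICE = "voice"
    VIDEO = "video"
    HANDWRITING = "handwriting"
    EEG = "eeg"


@dataclass
class CatalogueEntry:
    dataset_id: str             # unique identifier within the database network
    provider: str               # institution hosting the data
    modalities: List[Modality]  # multimodal signals available in the dataset
    collection_protocol: str    # reference to the common collection criteria
    access_policy: str          # e.g. "on-request" or "federated-analysis-only"
    anonymised: bool = True     # privacy flag: direct identifiers removed
    keywords: List[str] = field(default_factory=list)


# Example record for a hypothetical voice-and-EEG dataset.
entry = CatalogueEntry(
    dataset_id="d3b-0001",
    provider="Example Clinical Centre",
    modalities=[Modality.VOICE, Modality.EEG],
    collection_protocol="d3b-common-criteria-v1",
    access_policy="federated-analysis-only",
    keywords=["depression", "speech", "EEG"],
)
print(entry.dataset_id, [m.value for m in entry.modalities])
```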


Notes

  1. http://data.ctdata.org/dataset/mental-health.

  2. https://menhir-project.eu/.

  3. http://www.empathic-project.eu/.

  4. https://www.fon.hum.uva.nl/praat/.

  5. https://github.com/stefanomarrone/d3b.


Acknowledgement

The research leading to these results has received funding from the EU H2020 research and innovation programme under grant agreements N. 769872 (EMPATHIC) and N. 823907 (MENHIR), from the SIROBOTICS project funded by the Italian MIUR (PNR 2015–2020, D.D. 1735, 13/07/2017), and from the ANDROIDS project funded by the V:ALERE 2019 programme of the Università della Campania “Luigi Vanvitelli” (D.R. 906 of 4/10/2019, prot. n. 157264, 17/10/2019).

The work of Laura Verde is supported by the “Predictive Maintenance Multidominio (Multidomain Predictive Maintenance)” project, PON “Ricerca e Innovazione” 2014–2020, Asse IV “Istruzione e ricerca per il recupero”, Azione IV.4 “Dottorati e contratti di ricerca su tematiche dell’innovazione”, CUP: B61B21005470007.

Author information

Corresponding author

Correspondence to Stefano Marrone.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

Cite this paper

Amorese, T. et al. (2022). Toward an Interoperable Catalogue of Multimodal Depression-Related Data. In: Mahmud, M., Ieracitano, C., Kaiser, M.S., Mammone, N., Morabito, F.C. (eds) Applied Intelligence and Informatics. AII 2022. Communications in Computer and Information Science, vol 1724. Springer, Cham. https://doi.org/10.1007/978-3-031-24801-6_27
