Intelligent Human-Computer Interaction Interface: A Bibliometric Analysis of 2010–2022

Conference paper

Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14029)

Abstract

Intelligent interfaces play an important role in making human-computer interaction harmonious and natural. This paper investigates the hot spots and trends in the field of intelligent human-computer interaction interfaces (IHCII) from 2010 to 2022 through bibliometric analysis. Author, citation, co-citation, and keyword co-occurrence networks were visualized; the analysis covered 1,784 articles and 80,964 cited references. The results show that emotion recognition and EEG are at the forefront of IHCII research. China leads in publications (359), while the US dominates in citations (7,007). The Centre National de la Recherche Scientifique is the most productive organization, and IEEE Access is the journal with the most papers on IHCII. Keyword co-occurrence analysis suggests that “user experience”, “virtual reality”, “eye tracking”, “emotion recognition”, “big data”, and “mental workload” may be the research hotspots in this field. For researchers, the paper identifies interface design features as a research gap in IHCII.
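For readers unfamiliar with the method, the sketch below illustrates the core of keyword co-occurrence analysis as the abstract describes it: count how often pairs of author keywords appear in the same record, then treat those counts as edge weights in a network whose strongest nodes indicate research hotspots. This is a minimal Python illustration, not the paper's actual pipeline (bibliometric studies of this kind are typically run with dedicated tools over the full record set); the `records` list and its keyword entries are made-up examples, not data from the study.

```python
# Minimal sketch of keyword co-occurrence network analysis.
# The records below are illustrative stand-ins for bibliographic
# entries; the study itself analyzed 1,784 articles.
from collections import Counter
from itertools import combinations

import networkx as nx

# Hypothetical author-keyword lists, one per article.
records = [
    ["emotion recognition", "EEG", "deep learning"],
    ["eye tracking", "mental workload", "user experience"],
    ["emotion recognition", "EEG", "affective computing"],
    ["virtual reality", "user experience", "eye tracking"],
]

# Count how often each unordered keyword pair appears in the same record.
pair_counts = Counter()
for keywords in records:
    for a, b in combinations(sorted(set(keywords)), 2):
        pair_counts[(a, b)] += 1

# Build the weighted co-occurrence network.
G = nx.Graph()
for (a, b), w in pair_counts.items():
    G.add_edge(a, b, weight=w)

# A keyword's total edge weight (node strength) approximates how
# central it is to the field; high-strength nodes suggest hotspots.
strength = {
    n: sum(d["weight"] for _, _, d in G.edges(n, data=True)) for n in G
}
for kw, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: co-occurrence strength {s}")
```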



Acknowledgments

This work was supported by the National Natural Science Foundation of Anhui Province (grant number 2208085MG183), the Key Project for Natural Science Fund of Colleges in Anhui Province (grant number KJ2021A0502), and the Project for Social Science Innovation and Development in Anhui Province (grant number 2021CX075). Further, we thank the editor and anonymous reviewers for their valuable comments and advice.

Author information


Corresponding author

Correspondence to Yaqin Cao.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, Y., Cao, Y., Liu, Y., Hu, X. (2023). Intelligent Human-Computer Interaction Interface: A Bibliometric Analysis of 2010–2022. In: Duffy, V.G. (eds) Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. HCII 2023. Lecture Notes in Computer Science, vol 14029. Springer, Cham. https://doi.org/10.1007/978-3-031-35748-0_40


  • DOI: https://doi.org/10.1007/978-3-031-35748-0_40


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35747-3

  • Online ISBN: 978-3-031-35748-0

  • eBook Packages: Computer Science, Computer Science (R0)
