DOI: 10.1145/3555776.3577643
Research article

Image4Assess: Automatic learning processes recognition using image processing

Published: 07 June 2023

Abstract

Recently, there has been growing interest in improving students' competitiveness in STEM education. Self-reporting and observation are the most commonly used tools for assessing STEM education. Despite their effectiveness, these tools face several challenges: they are labor-intensive and time-consuming, prone to subjective bias, constrained by memory limitations, and influenced by social expectations. To address these challenges, we propose an approach called Image4Assess that leverages state-of-the-art machine learning techniques, such as convolutional neural networks and transfer learning, to automatically and continuously assess students' learning processes during STEM activities via image processing. Our findings reveal that Image4Assess achieves accuracy, precision, and recall above 85% in recognizing students' learning processes, implying that students' learning processes in STEM education can be measured accurately from imagery data. We also found a significant correlation between the learning processes identified by our approach and students' post-test scores, confirming its effectiveness in real-world classrooms.
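The abstract reports accuracy, precision, and recall above 85% for learning-process recognition. As a reminder of how these three metrics relate in a multi-class setting, here is a minimal, self-contained sketch; the class names and label sequences are hypothetical illustrations, not data from the paper:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true label."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def precision_recall(y_true, y_pred, positive):
    """Precision and recall for one class treated as the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return prec, rec

# Hypothetical frame-level labels for three learning-process classes.
y_true = ["observe", "build", "build", "discuss", "observe", "build"]
y_pred = ["observe", "build", "observe", "discuss", "observe", "build"]

print(accuracy(y_true, y_pred))                   # 5/6, about 0.833
print(precision_recall(y_true, y_pred, "build"))  # (1.0, 2/3)
```

In a multi-class recognition task like this, precision and recall are computed per class and then averaged; a reported ">85% precision and recall" would typically refer to such an average across the learning-process classes.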


Cited By

  • (2024) Economic Fruit Trees Recognition in Hillsides: A CNN-Based Approach Using Enhanced UAV Imagery. IEEE Access, 12, 61991-62005. DOI: 10.1109/ACCESS.2024.3391371
  • (2024) ImageLM. Expert Systems with Applications, 238:PE. DOI: 10.1016/j.eswa.2023.122283. Online publication date: 27 Feb 2024
  • (2024) Bridging STEM Education and Ubiquitous Learning: A Case Study on Developing a LINE Chatbot with Google's Gemini for Virtual Peer Collaboration. Innovative Technologies and Learning, 237-246. DOI: 10.1007/978-3-031-65884-6_25. Online publication date: 21 Jul 2024


Published In

SAC '23: Proceedings of the 38th ACM/SIGAPP Symposium on Applied Computing
March 2023, 1932 pages
ISBN: 9781450395175
DOI: 10.1145/3555776

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. convolutional neural networks
      2. STEM
      3. learning behavior


Conference

SAC '23

Acceptance Rates

Overall Acceptance Rate: 1,650 of 6,669 submissions, 25%
