ABSTRACT
Project-based learning has found its way into a range of formal and informal learning environments. However, systematically assessing these environments remains a significant challenge. Traditional assessments, which focus on learning outcomes, are poorly matched to the process-oriented goals of project-based learning. Multimodal interfaces and multimodal learning analytics hold considerable promise for assessing learning in open-ended learning environments. By richly integrating a multitude of data streams with naturalistic interfaces, this area of research may help usher in a new wave of education reform that supports alternative modes of learning.