Abstract
The ability to solve complex problems is one of the key competencies in science. Previous research on modeling scientific problem solving has mainly focused on the dimensionality of the construct and has rarely addressed psychometric test characteristics such as local item dependencies, which can occur especially in computer-based assessments. The present study consequently aims to model scientific problem solving by taking into account four components of the construct and dependencies among items within these components. Based on a data set of 1,487 German high-school students of different grade levels, who worked on computer-based assessments of problem solving, local item dependencies were quantified by using testlet models and Q3 statistics. The results revealed that a model differentiating testlets of cognitive processes and virtual systems fitted the data best and remained invariant across grades.
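The Q3 statistic mentioned above (Yen, 1993) quantifies local item dependence as the correlation of item residuals after fitting an IRT model. The following is a minimal sketch of that idea in Python, using synthetic responses simulated under a Rasch model; the item difficulties, person abilities, and sample sizes are hypothetical illustrations, not values from the study's data set.

```python
import numpy as np

# Hypothetical setup: 500 persons, 6 items, Rasch model.
rng = np.random.default_rng(0)
n_persons, n_items = 500, 6
theta = rng.normal(0, 1, n_persons)      # person abilities
b = np.linspace(-1.5, 1.5, n_items)      # item difficulties

# Model-implied success probabilities under the Rasch model:
# P(x = 1) = 1 / (1 + exp(-(theta - b)))
p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))

# Simulate dichotomous responses from these probabilities.
x = (rng.random((n_persons, n_items)) < p).astype(float)

# Q3: correlate item residuals (observed minus expected) across persons.
residuals = x - p
q3 = np.corrcoef(residuals, rowvar=False)

# With the true abilities used here, off-diagonal Q3 values scatter
# near zero under local independence (with estimated abilities they
# centre near -1/(n_items - 1)); large values flag dependent item pairs.
off_diag = q3[np.triu_indices(n_items, k=1)]
print(off_diag.round(3))
```

In an application like the one described in the abstract, markedly positive Q3 values for item pairs within the same virtual system or cognitive process would motivate the testlet models the study compares.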
Acknowledgements
The author wishes to thank Professor Dr. Rüdiger Tiemann (Humboldt-Universität zu Berlin, Germany) for his conceptual support in conducting this study. This research has been partly funded by a grant from the German Academic Exchange Service (DAAD).
Copyright information
© 2015 Springer-Verlag Berlin Heidelberg
Cite this paper
Scherer, R. (2015). Psychometric Challenges in Modeling Scientific Problem-Solving Competency: An Item Response Theory Approach. In: Lausen, B., Krolak-Schwerdt, S., Böhmer, M. (eds) Data Science, Learning by Latent Structures, and Knowledge Discovery. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44983-7_33
DOI: https://doi.org/10.1007/978-3-662-44983-7_33
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-44982-0
Online ISBN: 978-3-662-44983-7
eBook Packages: Mathematics and Statistics (R0)