Psychometric Challenges in Modeling Scientific Problem-Solving Competency: An Item Response Theory Approach

Conference paper
In: Data Science, Learning by Latent Structures, and Knowledge Discovery

Abstract

The ability to solve complex problems is one of the key competencies in science. Previous research on modeling scientific problem solving has focused mainly on the dimensionality of the construct and has rarely addressed psychometric test characteristics such as local item dependencies, which can occur especially in computer-based assessments. The present study therefore models scientific problem solving by taking into account four components of the construct and the dependencies among items within these components. Based on a data set of 1,487 German high-school students of different grade levels who worked on computer-based assessments of problem solving, local item dependencies were quantified using testlet models and Q3 statistics. The results revealed that a model differentiating testlets of cognitive processes and virtual systems fitted the data best and remained invariant across grades.
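To make the Q3 diagnostic concrete, the sketch below illustrates how local item dependence can be quantified as Yen's Q3: the pairwise correlations of Rasch-model residuals across persons. This is a minimal illustration only, not the software used in the original analyses; the function names (`rasch_prob`, `q3_matrix`) and the simulated data are hypothetical.

```python
import numpy as np

def rasch_prob(theta, b):
    """Expected probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

def q3_matrix(responses, theta, b):
    """Yen's Q3: pairwise correlations of Rasch residuals.

    responses : (persons x items) 0/1 response matrix
    theta     : person ability estimates
    b         : item difficulty estimates
    """
    residuals = responses - rasch_prob(theta, b)
    q3 = np.corrcoef(residuals, rowvar=False)  # items x items correlation matrix
    np.fill_diagonal(q3, np.nan)               # self-correlations are not informative
    return q3

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    theta = rng.normal(size=500)               # simulated person abilities
    b = np.linspace(-1.5, 1.5, 8)              # simulated item difficulties
    x = (rng.random((500, 8)) < rasch_prob(theta, b)).astype(int)
    print(np.round(q3_matrix(x, theta, b), 2))
```

Item pairs whose Q3 values stand well above the average, for instance items sharing the same virtual system or stimulus, point to local dependence that a testlet model can absorb.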

Acknowledgements

The author wishes to thank Professor Dr. Rüdiger Tiemann (Humboldt-Universität zu Berlin, Germany) for his conceptual support in conducting this study. This research was partly funded by a grant from the German Academic Exchange Service (DAAD).

Author information

Corresponding author

Correspondence to Ronny Scherer.

Copyright information

© 2015 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Scherer, R. (2015). Psychometric Challenges in Modeling Scientific Problem-Solving Competency: An Item Response Theory Approach. In: Lausen, B., Krolak-Schwerdt, S., Böhmer, M. (eds) Data Science, Learning by Latent Structures, and Knowledge Discovery. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44983-7_33
