Abstract
It is well established in the literature that particular design features of questionnaires affect the distribution and association of collected data. We present a survey approach called Crafty Questionnaire Design (CQED) that allows the predictability and replicability of outcomes expected of the natural sciences to be achieved in the social sciences. Two independent proof-of-principle experiments studying interpersonal and institutional trust among Polish and Mexican students (n = 1402) show that different versions of a questionnaire yield predictably different outcomes. CQED promises a large gain in research efficiency in terms of the sample size required and the number of replications needed. This knowledge can safeguard the social scientific researcher against unpleasant surprises and inconvenient results. Knowledge of the principles of CQED could also serve editors and reviewers of social scientific journals as a tool to scrutinize the methodological soundness and improve the relevance of publications.
“For a man is angry at a libel because it is false, but at a satire because it is true.” (G.K. Chesterton).
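The split-ballot logic behind such experiments can be sketched in a toy simulation. The sketch below is entirely hypothetical (invented probabilities, not the chapter's data): two versions of the same 5-point trust item differ only in whether the agreement end of the scale is listed first, and a simple primacy bias toward the first-listed category, of the kind reviewed by Krosnick and Alwin (1987) and Chan (1991), is enough to shift the recoded means predictably apart.

```python
import random
import statistics

random.seed(42)

def simulate_responses(n, primacy_bias):
    """Draw n answers (1-5) to a single Likert-type trust item.

    primacy_bias moves probability mass toward the first-listed
    category, mimicking a response-order effect. The base
    distribution and bias size are illustrative assumptions.
    """
    base = [0.10, 0.20, 0.30, 0.25, 0.15]  # assumed latent distribution
    weights = [p + (primacy_bias if i == 0 else -primacy_bias / 4)
               for i, p in enumerate(base)]
    return random.choices(range(1, 6), weights=weights, k=n)

# Version A lists "distrust completely" first; version B reverses the
# scale, so after recoding (6 - r) the primacy bias lands on the
# opposite end. Group sizes roughly split the chapter's n = 1402.
version_a = simulate_responses(701, primacy_bias=0.12)
version_b = [6 - r for r in simulate_responses(701, primacy_bias=0.12)]

mean_a = statistics.mean(version_a)
mean_b = statistics.mean(version_b)
print(f"version A mean: {mean_a:.2f}")
print(f"version B mean: {mean_b:.2f}")
```

Because the bias is attached to whichever category is listed first, version B's recoded mean systematically exceeds version A's: the "finding" is manufactured by the instrument, which is precisely the point the abstract makes.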
References
Aikin, S.F.: Poe’s law, group polarization, and argumentative failure in religious and political discourse. Soc. Semiot. 23(3), 301–317 (2013). https://doi.org/10.1080/10350330.2012.719728
Aquilino, W.S.: Interview mode effects in surveys of drug and alcohol use: a field experiment. Public Opin. Q. 58(2), 210–240 (1994). https://doi.org/10.1086/269419
Barch, D.M., Yarkoni, T.: Introduction to the special issue on reliability and replication in cognitive and affective neuroscience research. Cogn. Affect. Behav. Neurosci. 13(4), 687–689 (2013). https://doi.org/10.3758/s13415-013-0201-7
Benton, J.E., Daly, J.L.: A question order effect in a local government survey. Public Opin. Q. 55(4), 640–642 (1991). https://doi.org/10.1086/269285
Blalock, H.M.: Comment on Coleman’s paper. In: Bierstedt, R. (ed.) A Design for Sociology: Scope, Objectives, and Methods, pp. 115–121. American Academy of Political and Social Science, Philadelphia (1969)
Bowling, A.: Mode of questionnaire administration can have serious effects on data quality. J. Public Health 27(3), 281–291 (2005). https://doi.org/10.1093/pubmed/fdi031
Brace, I.: Questionnaire Design: How to Plan, Structure and Write Survey Material for Effective Market Research, 3rd edn. Kogan Page Limited, London (2013)
Bradburn, N.M., Sudman, S.: The current status of questionnaire research. In: Biemer, P.P., Groves, R.M., Lyberg, L.E., Mathiowetz, N.A., Sudman, S. (eds.) Measurement Errors in Surveys, pp. 29–40. John Wiley & Sons, Inc., New York (1991)
Catania, J.A., McDermott, L.J., Pollack, L.M.: Questionnaire response bias and face-to-face interview sample bias in sexuality research. J. Sex Res. 22(1), 52–72 (1986). https://doi.org/10.1080/00224498609551289
Chan, J.C.: Response-order effects in Likert-Type scales. Educ. Psychol. Meas. 51(3), 531–540 (1991). https://doi.org/10.1177/0013164491513002
Chang, L.: A psychometric evaluation of 4-point and 6-point Likert-type scales in relation to reliability and validity. Appl. Psychol. Meas. 18(3), 205–215 (1994). https://doi.org/10.1177/014662169401800302
Christian, L.M., Dillman, D.A.: The influence of graphical and symbolic language manipulations on responses to self-administered questions. Public Opin. Q. 68(1), 57–80 (2004). https://doi.org/10.1093/poq/nfh004
Coleman, J.S.: The methods of sociology. In: Bierstedt, R. (ed.) A Design for Sociology: Scope, Objectives, and Methods, pp. 86–114. American Academy of Political and Social Science, Philadelphia (1969)
Cook, E.: Exaggeration. Eliza Cook’s J. 7(171), 225–226 (1852)
Crocker, J., Cooper, M.L.: Addressing scientific fraud. Science 334(6060), 1182 (2011). https://doi.org/10.1126/science.1216775
Davis, J.A.: What’s wrong with sociology? Sociol. Forum 9(2), 179–197 (1994). https://doi.org/10.1007/BF01476361
De Leeuw, E., Collins, M.: Data collection methods and survey quality: an overview. In: Lyberg, L.E., et al. (eds.) Survey Measurement and Process Quality, pp. 197–220. Wiley, New York (1997)
DeCastellarnau, A.: A classification of response scale characteristics that affect data quality: a literature review. Qual. Quant. 1–37 (2018). https://doi.org/10.1007/s11135-017-0533-4
Dickersin, K.: The existence of publication bias and risk factors for its occurrence. JAMA: J. Am. Med. Assoc. 263(10), 1385–1389 (1990). https://doi.org/10.1001/jama.1990.03440100097014
Dillman, D.A.: Mail and Telephone Surveys: The Total Design Method. Wiley, New York (1978)
Feynman, R.P.: Cargo cult science. Eng. Sci. 37(7), 10–13 (1974)
Fisher, W.P., Jr., Stenner, A.J.: Integrating qualitative and quantitative research approaches via the phenomenological method. Int. J. Mult. Res. Approaches 5(1), 89–103 (2011). https://doi.org/10.5172/mra.2011.5.1.89
Fowler, F.J., Jr., Roman, A.M., Di, Z.X.: Mode effects in a survey of medicare prostate surgery patients. Public Opin. Q. 62(1), 29–46 (1998). https://doi.org/10.1086/297829
Fox, M.F.: Fraud, ethics, and the disciplinary contexts of science and scholarship. Am. Sociol. 21(1), 67–71 (1990). https://doi.org/10.1007/BF02691783
Gove, W.R., Geerken, M.R.: Response bias in surveys of mental health: an empirical investigation. Am. J. Sociol. 82(6), 1289–1317 (1977). https://doi.org/10.1086/226466
Grandcolas, U., Rettie, R., Marusenko, K.: Web survey bias: sample or mode effect? J. Mark. Manag. 19(5–6), 541–561 (2003). https://doi.org/10.1080/0267257X.2003.9728225
Hauser, P.M.: Comment on Coleman’s paper. In: Bierstedt, R. (ed.) A Design for Sociology: Scope, Objectives, and Methods, pp. 122–128. American Academy of Political and Social Science, Philadelphia (1969)
Hilgard, E.R.: Intervening variables, hypothetical constructs, parameters, and constants. Am. J. Psychol. 71(1), 238–246 (1958). https://doi.org/10.2307/1419211
Huff, D.: How to Lie with Statistics. Norton, New York (1954)
Isaev, L.K.: The place of metrology in the science system: on postulates. Meas. Tech. 36(8), 853–854 (1993). https://doi.org/10.1007/BF00983977
Kagan, J.: The Three Cultures: Natural Sciences, Social Sciences, and the Humanities in the 21st Century. Cambridge University Press, New York (2009)
Kampen, J.K.: A proposal for the demarcation of theory and knowledge: of language-dependent and language-independent reality. Metaphilosophy 51(1), 97–110 (2020). https://doi.org/10.1111/meta.12398
Kampen, J.K., Tobi, H.: Social Scientific metrology as the mediator between sociology and socionomy: a cri de coeur for the systemizing of social indicators. In: Baird, C.M. (ed.) Social Indicators: Statistics, Trends and Policy Development, pp. 1–26. Nova Science Publishers, New York (2011)
Kampen, J.K., Van De Walle, S., Bouckaert, G.: Assessing the relation between satisfaction with public service delivery and trust in government: the impact of the predisposition of citizens toward government on evaluations of its performance. Public Perform. Manag. Rev. 29(4), 387–440 (2006). https://doi.org/10.1080/15309576.2006.11051881
Kezar, A., Gehrke, S.: Why are we hiring so many non-tenure-track faculty? Lib. Educ. 100(1), n1 (2014)
Koch, S.: The nature and limits of psychological knowledge: lessons of a century qua “science.” Am. Psychol. 36(3), 257–269 (1981). https://doi.org/10.1037/0003-066X.36.3.257
Koch, S.: “Psychology” or “the psychological studies”? Am. Psychol. 48(8), 902–904 (1993). https://doi.org/10.1037/0003-066X.48.8.902
Krebs, D., Hoffmeyer-Zlotnik, J.H.: Positive first or negative first? Effects of the order of answering categories on response behavior. Methodol.: Eur. J. Res. Methods Behav. Soc. Sci. 6(3), 118–127 (2010). https://doi.org/10.1027/1614-2241/a000013
Krosnick, J.A., Alwin, D.F.: An evaluation of a cognitive theory of response-order effects in survey measurement. Public Opin. Q. 51(2), 201–219 (1987). https://doi.org/10.1086/269029
Legg, C., Stagaki, P.: How to be a postmodernist: a user’s guide to postmodern rhetorical practices. J. Fam. Ther. 24(4), 385–401 (2002). https://doi.org/10.1111/1467-6427.00226
Leung, S.O.: A comparison of psychometric properties and normality in 4-, 5-, 6-, and 11-point Likert scales. J. Soc. Serv. Res. 37(4), 412–421 (2011). https://doi.org/10.1080/01488376.2011.580697
Lopreato, J., Crippen, T.: Crisis in Sociology: The Need for Darwin. Transaction, New Brunswick (1999)
MacCorquodale, K., Meehl, P.E.: On a distinction between hypothetical constructs and intervening variables. Psychol. Rev. 55(2), 95–107 (1948). https://doi.org/10.1037/h0056029
Malhotra, N.K.: Marketing Research: An Applied Orientation, 7th edn. Pearson, Boston (2019)
Masicampo, E., Lalande, D.R.: A peculiar prevalence of p values just below .05. Q. J. Exp. Psychol. 65(11), 2271–2279 (2012)
McFarland, S.G.: Effects of question order on survey responses. Public Opin. Q. 45(2), 208–215 (1981). https://doi.org/10.1086/268651
Munafò, M.R., et al.: A manifesto for reproducible science. Nat. Hum. Behav. 1(1), 1–9 (2017)
Prus, R.: The interpretive challenge: the impending crisis in sociology. Can. J. Sociol./Cahiers canadiens de sociologie 15(3), 355–363 (1990). https://doi.org/10.2307/3340924
Rockwood, T.H., Sangster, R.L., Dillman, D.A.: The effect of response categories on questionnaire answers: context and mode effects. Sociol. Methods Res. 26(1), 118–140 (1997). https://doi.org/10.1177/0049124197026001004
Savage, M., Burrows, R.: The coming crisis of empirical sociology. Sociology 41(5), 885–899 (2007). https://doi.org/10.1177/0038038507080443
Schmaus, W.: Durkheim’s Philosophy of Science and the Sociology of Knowledge: Creating an Intellectual Niche. University of Chicago Press, Chicago (1994)
Schwarz, N.: Self-reports: how the questions shape the answers. Am. Psychol. 54(2), 93–105 (1999). https://doi.org/10.1037//0003-066x.54.2.93
Schwarz, N., Strack, F., Mai, H.P.: Assimilation and contrast effects in part-whole question sequences: a conversational logic analysis. Public Opin. Q. 55(1), 3–23 (1991). https://doi.org/10.1086/269239
Sijtsma, K.: Playing with data—or how to discourage questionable research practices and stimulate researchers to do things right. Psychometrika 81(1), 1–15 (2016). https://doi.org/10.1007/s11336-015-9446-0
Simmons, J.P., Nelson, L.D., Simonsohn, U.: False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 22(11), 1359–1366 (2011)
Simonsohn, U.: Just post it: the lesson from two cases of fabricated data detected by statistics alone. Psychol. Sci. 24(10), 1875–1888 (2013)
Simonsohn, U., Nelson, L.D., Simmons, J.P.: P-curve: a key to the file-drawer. J. Exp. Psychol. Gen. 143(2), 534 (2014)
Singer, E., Kohnke-Aguirre, L.: Interviewer expectation effects: a replication and extension. Public Opin. Q. 43(2), 245–260 (1979). https://doi.org/10.1086/268515
Steele, J.M.: Darrell Huff and fifty years of “how to lie with statistics.” Stat. Sci. 20(3), 205–209 (2005)
Stroebe, W., Postmes, T., Spears, R.: Scientific misconduct and the myth of self-correction in science. Perspect. Psychol. Sci. 7(6), 670–688 (2012). https://doi.org/10.1177/1745691612460687
Sudman, S., Bradburn, N.M.: Response Effects in Surveys: A Review and Synthesis. Aldine Publ. Co., Chicago (1974)
Tourangeau, R.: Cognitive aspects of survey measurement and mismeasurement. Int. J. Public Opin. Res. 15(1), 3–7 (2003). https://doi.org/10.1093/ijpor/15.1.3
Tourangeau, R., Couper, M.P., Conrad, F.: Spacing, position, and order: interpretive heuristics for visual features of survey questions. Public Opin. Q. 68(3), 368–393 (2004). https://doi.org/10.1093/poq/nfh035
Tourangeau, R., Couper, M.P., Conrad, F.: Color, labels, and interpretive heuristics for response scales. Public Opin. Q. 71(1), 91–112 (2007). https://doi.org/10.1093/poq/nfl046
Van de Walle, S., Kampen, J.K., Bouckaert, G.: Deep impact for high-impact agencies? Assessing the role of bureaucratic encounters in evaluations of government. Public Perform. Manag. Rev. 28(4), 532–549 (2005). https://doi.org/10.1080/15309576.2005.11051846
Van De Walle, S., Van Ryzin, G.G.: The order of questions in a survey on citizen satisfaction with public services: lessons from a split-ballot experiment. Public Adm. 89(4), 1436–1450 (2011). https://doi.org/10.1111/j.1467-9299.2011.01922.x
Wigboldus, D.H.J., Dotsch, R.: Encourage playing with data and discourage questionable reporting practices. Psychometrika 81(1), 27–32 (2016). https://doi.org/10.1007/s11336-015-9445-1
Xu, M.L., Leung, S.O.: Effects of varying numbers of Likert scale points on factor structure of the Rosenberg Self‐Esteem Scale. Asian J. Soc. Psychol. 21(3), 119–128 (2018). https://doi.org/10.1111/ajsp.12214
Acknowledgements
We express our gratitude to David Zepeda Quintana (University of Sonora, Hermosillo, Mexico) and Bartosz Fortuński (Opole University, Poland) for their support with the collection of the data.
Copyright information
© 2022 Springer-Verlag GmbH Germany, part of Springer Nature
Cite this chapter
Kampen, J.K., Van Dam, Y.K., Platje, J. (2022). Lies, Damned Lies, and Crafty Questionnaire Design. In: Nguyen, N.T., Kowalczyk, R., Mercik, J., Motylska-Kuźma, A. (eds.) Transactions on Computational Collective Intelligence XXXVII. Lecture Notes in Computer Science, vol. 13750. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-66597-8_4
DOI: https://doi.org/10.1007/978-3-662-66597-8_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-66596-1
Online ISBN: 978-3-662-66597-8
eBook Packages: Computer Science, Computer Science (R0)