Abstract
This paper introduces a framework for Respondent Behavior Logging (RBL), comprising static and dynamic models that conceptualize respondent behavior when filling in online questionnaires, together with visualization techniques and measurement constructs for RBL data. Web survey design may benefit from paradata logging as an evaluation technique, since such data can inform the re-design of questionnaires. Although other aspects of online surveys have attracted considerable attention in both industry and the literature, how the Web may leverage new and innovative techniques to support survey design remains underexplored. We evaluate the RBL framework through a focus group and an experimental survey with 120 participants, and elaborate on implications for research and practice in an informed argument.
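The paradata logging the abstract refers to can be illustrated with a minimal sketch: each respondent interaction with a questionnaire item (focus, answer change, blur) is recorded as a timestamped event, from which simple measurement constructs such as dwell time and answer-change counts can be derived. This is a hypothetical illustration under assumed event names, not the authors' actual RBL implementation.

```javascript
// Minimal paradata-logger sketch: records timestamped respondent events
// per questionnaire item. Hypothetical illustration, not the paper's RBL artifact.
// `now` is injectable so the logger can be tested deterministically.
function createParadataLogger(now = Date.now) {
  const events = [];
  return {
    // Record one interaction event (e.g. 'focus', 'change', 'blur') for an item.
    log(itemId, type, value = null) {
      events.push({ itemId, type, value, t: now() });
    },
    // Dwell time on an item: last event timestamp minus first, in ms.
    dwellMs(itemId) {
      const ts = events.filter(e => e.itemId === itemId).map(e => e.t);
      return ts.length ? Math.max(...ts) - Math.min(...ts) : 0;
    },
    // Number of answer changes: a simple measurement construct over the log.
    changeCount(itemId) {
      return events.filter(e => e.itemId === itemId && e.type === 'change').length;
    },
    // Copy of the raw event log, e.g. for visualization.
    all() { return events.slice(); },
  };
}
```

In a browser the events would come from DOM listeners, e.g. `input.addEventListener('change', e => logger.log(e.target.id, 'change', e.target.value))`; injecting the clock keeps the derived measures testable offline.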
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Rahman, M.H., Sjöström, J. (2021). Respondent Behavior Logging: A Design Science Research Inquiry into Web Survey Paradata. In: Chandra Kruse, L., Seidel, S., Hausvik, G.I. (eds) The Next Wave of Sociotechnical Design. DESRIST 2021. Lecture Notes in Computer Science(), vol 12807. Springer, Cham. https://doi.org/10.1007/978-3-030-82405-1_25
Print ISBN: 978-3-030-82404-4
Online ISBN: 978-3-030-82405-1