Respondent Behavior Logging: A Design Science Research Inquiry into Web Survey Paradata

  • Conference paper

The Next Wave of Sociotechnical Design (DESRIST 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12807)


Abstract

This paper introduces a framework for Respondent Behavior Logging (RBL), consisting of static and dynamic models that conceptualize respondent behavior when filling in online questionnaires, together with visualization techniques and measurement constructs for RBL data. Web survey design may benefit from paradata logging as an evaluation technique, since such data may prove useful when re-designing questionnaires. Although other aspects of online surveys have attracted considerable attention in both industry and the literature, how the Web may be leveraged with new and innovative techniques to support survey design remains underexplored. The RBL framework is evaluated using a focus group and through an experimental survey with 120 participants. We elaborate on implications for research and practice in an informed argument.
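The abstract describes measurement constructs derived from logged respondent behavior. As a minimal illustration of what such a construct could look like, the sketch below computes per-question response time from a focus/blur event log. The event-tuple format, field names, and question identifiers are hypothetical, not taken from the paper.

```python
from collections import defaultdict

# Hypothetical RBL-style event log: (timestamp in ms, question id, event type).
# A revisit to q1 shows why intervals must be summed, not taken once.
events = [
    (0,     "q1", "focus"),
    (4200,  "q1", "blur"),
    (4300,  "q2", "focus"),
    (9100,  "q2", "blur"),
    (9200,  "q1", "focus"),   # respondent returns to q1
    (12000, "q1", "blur"),
]

def time_per_question(log):
    """Sum focus->blur intervals per question, in milliseconds."""
    totals = defaultdict(int)
    open_focus = {}           # question id -> timestamp of the open focus event
    for ts, qid, event in log:
        if event == "focus":
            open_focus[qid] = ts
        elif event == "blur" and qid in open_focus:
            totals[qid] += ts - open_focus.pop(qid)
    return dict(totals)

print(time_per_question(events))
# → {'q1': 7000, 'q2': 4800}
```

Summing over revisits rather than recording a single interval is one way such a measure could surface response difficulty, e.g. questions respondents return to repeatedly.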



Author information

Corresponding author

Correspondence to Jonas Sjöström.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Rahman, M.H., Sjöström, J. (2021). Respondent Behavior Logging: A Design Science Research Inquiry into Web Survey Paradata. In: Chandra Kruse, L., Seidel, S., Hausvik, G.I. (eds) The Next Wave of Sociotechnical Design. DESRIST 2021. Lecture Notes in Computer Science, vol 12807. Springer, Cham. https://doi.org/10.1007/978-3-030-82405-1_25


  • DOI: https://doi.org/10.1007/978-3-030-82405-1_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-82404-4

  • Online ISBN: 978-3-030-82405-1

  • eBook Packages: Computer Science (R0)
