DOI: 10.1145/2207676.2208589 · CHI Conference Proceedings · research-article

"Yours is better!": participant response bias in HCI

Published: 05 May 2012

ABSTRACT

Although HCI researchers and practitioners frequently work with groups of people that differ significantly from themselves, little attention has been paid to the effects these differences have on the evaluation of HCI systems. Via 450 interviews in Bangalore, India, we measure participant response bias due to interviewer demand characteristics and the role of social and demographic factors in influencing that bias. We find that respondents are about 2.5x more likely to prefer a technological artifact they believe to be developed by the interviewer, even when the alternative is identical. When the interviewer is a foreign researcher requiring a translator, the bias towards the interviewer's artifact increases to 5x. In fact, the interviewer's artifact is preferred even when it is degraded to be obviously inferior to the alternative. We conclude that participant response bias should receive more attention within the CHI community, especially when designing for underprivileged populations.
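The abstract's headline numbers (2.5x, 5x) are ratios of preference rates between the two artifacts in a forced-choice comparison. As a minimal sketch of that arithmetic — using invented counts, not the paper's actual data — the computation looks like this:

```python
# Hypothetical illustration (counts are invented, not from the paper):
# how much more often respondents chose the artifact they believed was
# developed by the interviewer, versus the identical alternative.

def preference_ratio(prefers_interviewer: int, prefers_alternative: int) -> float:
    """Ratio of choices favoring the interviewer's artifact to choices
    favoring the alternative, in a two-alternative forced choice."""
    if prefers_alternative == 0:
        raise ValueError("need at least one choice of the alternative")
    return prefers_interviewer / prefers_alternative

# Example: out of 100 forced-choice responses, 71 favor the artifact
# attributed to the interviewer and 29 favor the identical alternative.
ratio = preference_ratio(71, 29)
print(f"{ratio:.1f}x")  # → 2.4x, roughly the magnitude the abstract reports
```

With no response bias the two identical artifacts would be chosen at equal rates (a ratio near 1.0); ratios well above 1.0 indicate the demand-characteristic effect the paper measures.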


Published in:
CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2012, 3276 pages
ISBN: 9781450310154
DOI: 10.1145/2207676
Copyright © 2012 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Overall acceptance rate: 6,199 of 26,314 submissions (24%)
