ABSTRACT
Although HCI researchers and practitioners frequently work with groups of people that differ significantly from themselves, little attention has been paid to the effects these differences have on the evaluation of HCI systems. Via 450 interviews in Bangalore, India, we measure participant response bias due to interviewer demand characteristics and the role of social and demographic factors in influencing that bias. We find that respondents are about 2.5x more likely to prefer a technological artifact they believe to be developed by the interviewer, even when the alternative is identical. When the interviewer is a foreign researcher requiring a translator, the bias towards the interviewer's artifact increases to 5x. In fact, the interviewer's artifact is preferred even when it is degraded to be obviously inferior to the alternative. We conclude that participant response bias should receive more attention within the CHI community, especially when designing for underprivileged populations.
"Yours is better!": Participant Response Bias in HCI