DOI: 10.1145/2207676.2208589

"Yours is better!": participant response bias in HCI

Published: 05 May 2012

Abstract

Although HCI researchers and practitioners frequently work with groups of people that differ significantly from themselves, little attention has been paid to the effects these differences have on the evaluation of HCI systems. Via 450 interviews in Bangalore, India, we measure participant response bias due to interviewer demand characteristics and the role of social and demographic factors in influencing that bias. We find that respondents are about 2.5x more likely to prefer a technological artifact they believe to be developed by the interviewer, even when the alternative is identical. When the interviewer is a foreign researcher requiring a translator, the bias towards the interviewer's artifact increases to 5x. In fact, the interviewer's artifact is preferred even when it is degraded to be obviously inferior to the alternative. We conclude that participant response bias should receive more attention within the CHI community, especially when designing for underprivileged populations.
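To make the abstract's headline numbers concrete, the sketch below shows one way a preference-bias ratio of this kind can be computed from forced-choice counts. It is a minimal illustration only: the counts and the function name are hypothetical and do not come from the paper. In an unbiased comparison of two identical artifacts the split should be near 50/50, so a ratio well above 1 signals interviewer-driven response bias.

```python
# Illustrative only -- hypothetical counts, not data from the paper.
# In a forced-choice comparison between two identical artifacts, respondents
# who show no interviewer bias should split roughly 50/50. The ratio below
# expresses how strongly the split leans toward the artifact attributed to
# the interviewer (e.g., "2.5x more likely").

def preference_bias_ratio(prefers_interviewer: int, prefers_alternative: int) -> float:
    """Ratio of respondents choosing the interviewer's artifact to those
    choosing the identical alternative."""
    if prefers_interviewer < 0 or prefers_alternative <= 0:
        raise ValueError("counts must be non-negative, with at least one "
                         "respondent preferring the alternative")
    return prefers_interviewer / prefers_alternative

if __name__ == "__main__":
    # Hypothetical split of 150 respondents, chosen only for illustration.
    print(preference_bias_ratio(107, 43))  # ~2.49, i.e., about "2.5x"
```

Applied to the corresponding counts from a condition with a foreign, translator-mediated interviewer, the same calculation would yield a larger ratio, of the order of the 5x figure the abstract reports.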

    Published In

    CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    May 2012
    3276 pages
ISBN: 9781450310154
DOI: 10.1145/2207676

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. bias
    2. culture
    3. demand characteristics
    4. hci4d
    5. ictd
    6. interviewer effects
    7. methods
    8. social status

    Qualifiers

    • Research-article

    Conference

    CHI '12

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%
