Research Article
DOI: 10.1145/3441852.3471216

Accept or Address? Researchers’ Perspectives on Response Bias in Accessibility Research

Published: 17 October 2021

Editorial Notes

The authors have requested minor, non-substantive changes to the VoR and, in accordance with ACM policies, a Corrected VoR was published on November 5, 2021. For reference purposes the VoR may still be accessed via the Supplemental Material section on this page.

Abstract

Response bias has been framed as the tendency of a participant's responses to be skewed by a variety of factors, including study design and participant–researcher dynamics. Response bias is a concern for all researchers who conduct studies with people, and especially for those working with participants with disabilities: these participants' diverse needs require methodological adjustments, and differences in disability identity between researcher and participant influence power dynamics. Despite its relevance, little literature connects response bias to accessibility. We conducted semi-structured interviews with 27 accessibility researchers about how response bias manifested in their research and how they mitigated it. We present unique instances of response bias and how it is handled in accessibility research; insights into how response bias interacts with other biases, such as researcher bias and sampling bias; and philosophies and tensions around response bias, such as whether to accept or address it. We conclude with guidelines for thinking about response bias in accessibility research.

Supplementary Material

3471216-vor (3471216-vor.pdf)
Version of Record for "Accept or Address? Researchers' Perspectives on Response Bias in Accessibility Research" by Ming et al., The 23rd International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '21).



        Published In

        ASSETS '21: Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility
        October 2021
        730 pages
        ISBN:9781450383066
        DOI:10.1145/3441852
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. Response bias
        2. accessibility dongle
        3. charity model of disability
        4. participant-researcher power dynamics

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        ASSETS '21

        Acceptance Rates

        ASSETS '21 Paper Acceptance Rate 36 of 134 submissions, 27%;
        Overall Acceptance Rate 436 of 1,556 submissions, 28%


        Article Metrics

        • Downloads (Last 12 months)168
        • Downloads (Last 6 weeks)22
        Reflects downloads up to 14 Feb 2025

        Cited By
        • (2024) Mitigating Epistemic Injustice: The Online Construction of a Bisexual Culture. ACM Transactions on Computer-Human Interaction 31(4), 1–34. https://doi.org/10.1145/3648614. Online publication date: 19-Sep-2024.
        • (2024) Invisible, Unreadable, and Inaudible Cookie Notices: An Evaluation of Cookie Notices for Users with Visual Impairments. ACM Transactions on Accessible Computing 17(1), 1–39. https://doi.org/10.1145/3641281. Online publication date: 18-Mar-2024.
        • (2023) Participatory Design of Virtual Humans for Mental Health Support Among North American Computer Science Students: Voice, Appearance, and the Similarity-attraction Effect. ACM Transactions on Applied Perception 20(3), 1–27. https://doi.org/10.1145/3613961. Online publication date: 20-Sep-2023.
        • (2023) Critical-Reflective Human-AI Collaboration: Exploring Computational Tools for Art Historical Image Retrieval. Proceedings of the ACM on Human-Computer Interaction 7(CSCW2), 1–33. https://doi.org/10.1145/3610054. Online publication date: 4-Oct-2023.
        • (2023) Large-Scale Anonymized Text-based Disability Discourse Dataset. In Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1–5. https://doi.org/10.1145/3597638.3614476. Online publication date: 22-Oct-2023.
        • (2023) Integrating Artificial Intelligence with Customer Experience in Banking: An Empirical Study on how Chatbots and Virtual Assistants Enhance Empathy. In 2023 International Conference on Computing, Networking, Telecommunications & Engineering Sciences Applications (CoNTESA), 69–74. https://doi.org/10.1109/CoNTESA61248.2023.10384979. Online publication date: 14-Dec-2023.
        • (2022) Data-Triangulation Through Multiple Methods. In Handbook of Research on Digital-Based Assessment and Innovative Practices in Education, 90–115. https://doi.org/10.4018/978-1-6684-2468-1.ch005. Online publication date: 6-May-2022.
        • (2022) Elevating strengths and capacities. Interactions 29(5), 28–33. https://doi.org/10.1145/3549068. Online publication date: 30-Aug-2022.
        • (2022) Toward Inclusion and Accessibility in Visualization Research: Speculations on Challenges, Solution Strategies, and Calls for Action (Position Paper). In 2022 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV), 20–27. https://doi.org/10.1109/BELIV57783.2022.00007. Online publication date: Oct-2022.
