The Algorithmic Leviathan: Arbitrariness, Fairness, and Opportunity in Algorithmic Decision Making Systems

Published: 01 March 2021 · DOI: 10.1145/3442188.3445942

Abstract

Automated decision-making systems implemented in public life are typically standardized. One algorithmic decision-making system can replace thousands of human deciders. Each of the humans so replaced had her own decision-making criteria: some good, some bad, and some arbitrary. Is such arbitrariness of moral concern?
We argue that an isolated arbitrary decision need not morally wrong the individual whom it misclassifies. However, if the same algorithms are applied across a public sphere, such as hiring or lending, a person could be excluded from a large number of opportunities. This harm persists even when the automated decision-making systems are "fair" on standard metrics of fairness. We argue that such arbitrariness at scale is morally problematic and propose technically informed solutions that can lessen the impact of algorithms at scale and so mitigate or avoid the moral harms we identify.
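The mechanism behind this claim can be illustrated with a small simulation (not from the paper; the population sizes, noise levels, and function names below are illustrative assumptions). It compares two regimes: fifty institutions that each apply their own noisy, idiosyncratic judgment, and fifty institutions that all reuse one standardized model. Each institution accepts roughly half of applicants in both regimes, so per-institution acceptance rates look alike; the difference appears only in how many people are rejected everywhere at once.

```python
import random

random.seed(0)

N_APPLICANTS = 10_000    # people who apply to every opportunity
N_INSTITUTIONS = 50      # e.g., employers or lenders in one sphere
THRESHOLD = 0.5          # every decider accepts roughly half of applicants

# Each applicant has a latent score; a decider thresholds a noisy reading of it.
scores = [random.random() for _ in range(N_APPLICANTS)]

def accepted(score: float, noise: float) -> bool:
    """One decision: accept if a noisy reading of the score clears the threshold."""
    return score + random.gauss(0.0, noise) > THRESHOLD

def excluded_everywhere(noise_levels: list[float]) -> int:
    """Count applicants rejected by every institution in the sphere."""
    return sum(
        1 for s in scores
        if not any(accepted(s, noise) for noise in noise_levels)
    )

# Regime 1: idiosyncratic human deciders -> independent noise at each institution.
human_deciders = [0.3] * N_INSTITUTIONS
# Regime 2: one shared algorithm -> zero decision noise, so every institution
# rejects exactly the same set of people.
shared_algorithm = [0.0] * N_INSTITUTIONS

print("shut out everywhere (human deciders): ", excluded_everywhere(human_deciders))
print("shut out everywhere (shared algorithm):", excluded_everywhere(shared_algorithm))
```

With independent deciders each rejection is arbitrary but uncorrelated, so only a handful of applicants are rejected everywhere; with the shared model, every below-threshold applicant (about half the population in this toy setup) is shut out of all fifty opportunities at once, even though each institution's acceptance rate is unchanged. That correlated exclusion is the "arbitrariness at scale" the abstract identifies.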



Information

        Published In

        FAccT '21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency
        March 2021
        899 pages
ISBN: 9781450383097
DOI: 10.1145/3442188
        Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

        Publisher

Association for Computing Machinery, New York, NY, United States

        Publication History

        Published: 01 March 2021


        Author Tags

        1. algorithmic decision making
        2. arbitrariness
        3. automated hiring
        4. fairness
        5. machine learning
        6. opportunity

        Qualifiers

        • Abstract
        • Research
        • Refereed limited

        Conference

        FAccT '21


Article Metrics

• Downloads (last 12 months): 22
• Downloads (last 6 weeks): 2
        Reflects downloads up to 05 Mar 2025

Cited By

• (2024) Algorithmic Arbitrariness in Content Moderation. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 2234-2253. DOI: 10.1145/3630106.3659036. Online publication date: 3-Jun-2024.
• (2024) The Digital Faces of Oppression and Domination: A Relational and Egalitarian Perspective on the Data-driven Society and its Regulation. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 701-712. DOI: 10.1145/3630106.3658934. Online publication date: 3-Jun-2024.
• (2024) Competing narratives in AI ethics: a defense of sociotechnical pragmatism. AI & SOCIETY. DOI: 10.1007/s00146-024-02128-2. Online publication date: 27-Dec-2024.
• (2023) Unlearning Descartes: Sentient AI is a Political Problem. Journal of Social Computing, 4(3), 193-204. DOI: 10.23919/JSC.2023.0020. Online publication date: Sep-2023.
• (2023) The Devil is in the Details: Interrogating Values Embedded in the Allegheny Family Screening Tool. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 1292-1310. DOI: 10.1145/3593013.3594081. Online publication date: 12-Jun-2023.
• (2023) Gender Artifacts in Visual Datasets. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 4814-4825. DOI: 10.1109/ICCV51070.2023.00446. Online publication date: 1-Oct-2023.
• (2023) Auditing Practitioner Judgment for Algorithmic Fairness Implications. 2023 IEEE International Symposium on Ethics in Engineering, Science, and Technology (ETHICS), 01-05. DOI: 10.1109/ETHICS57328.2023.10154992. Online publication date: 18-May-2023.
• (2023) Fairness as adequacy: a sociotechnical view on model evaluation in machine learning. AI and Ethics, 4(2), 427-440. DOI: 10.1007/s43681-023-00280-x. Online publication date: 12-Apr-2023.
• (2023) (Some) algorithmic bias as institutional bias. Ethics and Information Technology, 25(2). DOI: 10.1007/s10676-023-09698-7. Online publication date: 21-Mar-2023.
• (2023) Foundation and large language models: fundamentals, challenges, opportunities, and social impacts. Cluster Computing, 27(1), 1-26. DOI: 10.1007/s10586-023-04203-7. Online publication date: 27-Nov-2023.
