On the Site of Predictive Justice

ABSTRACT
Optimism about our ability to enhance societal decision-making by leaning on Machine Learning (ML) for cheap, accurate predictions has palled in recent years, as these 'cheap' predictions have come at significant social cost, contributing to systematic harms suffered by already disadvantaged populations. But what precisely goes wrong when ML goes wrong? We argue that, beyond the more obvious concerns about the downstream effects of ML-based decision-making, there can be moral grounds for criticizing these predictions themselves. We introduce and defend a theory of predictive justice, according to which differential model performance for systematically disadvantaged groups can be grounds for moral criticism of the model, independently of its downstream effects. As well as helping resolve some urgent disputes around algorithmic fairness, this theory points the way to a novel dimension of epistemic ethics, related to the recently discussed category of doxastic wrong. The full version of this paper is available at http://mintresearch.org/pj.