Abstract
Precautionary Principles are often said to be appropriate for decision-making in contexts of uncertainty such as climate policy. Contexts of uncertainty are contrasted with contexts of risk depending on whether or not we have probabilities. Against this view, I argue that the risk-uncertainty distinction is practically irrelevant. I start by noting that the history of the distinction between risk and uncertainty is more varied than is sometimes assumed. In order to examine the distinction, I unpack the idea of having probabilities, in particular by distinguishing three interpretations of probability: objective, epistemic, and subjective probability. I then claim that if we are concerned with whether we have probabilities at all—regardless of how low their epistemic credentials are—then we almost always have probabilities for policy-making. The reason is that subjective and epistemic probability are the relevant interpretations of probability, and we almost always have subjective and epistemic probabilities. In contrast, if we are only concerned with probabilities that have sufficiently high epistemic credentials, then we obviously do not always have probabilities. Climate policy, for example, would then be a case of decision-making under uncertainty. But, I argue, we should not dismiss probabilities with low epistemic credentials. Rather, when they are the best available probabilities, our decision principles should make use of them. And, since they are almost always available, the risk-uncertainty distinction remains irrelevant.
Notes
In Rawls’ original position, people decide on the fundamental principles of justice behind a veil of ignorance. This veil removes knowledge of the probability of having certain personal characteristics and of the historical circumstances in which one lives.
In the Ellsberg case, one bets on drawing a red ball from an urn which contains 100 red and black balls in an unknown ratio. The interesting point lies in comparing this to betting on drawing a red ball from an urn which is known to contain 50 red and 50 black balls.
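The tension in the Ellsberg case can be made explicit in expected-utility terms. The following is a standard reconstruction, not part of the original note; $p$ denotes the agent's subjective probability of drawing red from the ambiguous urn:

```latex
\[
EU_{\text{known}} = \tfrac{1}{2}\,u(\text{win}), \qquad
EU_{\text{unknown}} = p\,u(\text{win}).
\]
% Strictly preferring the known urn both for a bet on red and for a bet
% on black requires p < 1/2 and 1 - p < 1/2, which no single value of p
% satisfies. Hence no single subjective probability rationalizes the
% ambiguity-averse pattern of preferences.
```

This is why the Ellsberg pattern is often taken to show that agents do not treat the ambiguous urn as if they had a definite subjective probability for it.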
However, even Keynes (1937, p. 214) said that in situations of uncertainty we act as if we based our decisions on probabilities.
The following exposition draws much on Mellor’s (2005) introduction.
Confusingly, Weitzman (2007, p. 717) seems to make precisely this claim.
The expression “epistemic credentials of probabilities” is a shorthand for two things: First, it refers to the credentials of the epistemic access to the probabilities: Are our beliefs about the probabilities well-grounded? Second—and this applies only to the case of subjective probabilities—it refers to the credentials of the probabilities themselves: Are our subjective probabilities—i.e. our degrees of belief—well-grounded?
As a side remark, note that if objective probability were the relevant understanding of probability and if objective probability never existed, this would also support the claim that the risk-uncertainty distinction is no sensible basis for Precautionary Principles. The suggestion that we never have probabilities stands in contrast to this article’s claim that we always have probabilities. In either case, the risk-uncertainty distinction would be useless.
This point can also be expressed by saying that action-guiding principles provide us with a subjective ought rather than an objective ought.
“Features” should here be understood as a more ecumenical term than “consequences”.
The instruction to match degrees of belief to the evidence could either be conceived of as part of the decision-making principle or it could be conceived of as an additional moral or epistemic responsibility (and which is a condition for justifiably applying the decision-making principle). On this issue, see also Mintz-Woo (2017, pp. 95–96).
Here I assume an evidentialist epistemology: degrees of belief that p ought to correspond to the support given for p by the evidence. It is my hope that the conclusions of this article would also hold (mutatis mutandis) if we allow for plausible amendments to evidentialism.
If one places sufficiently high demands on probabilities, then one can of course also come to the conclusion that we are never, apart from textbook cases, in a context of risk but always in a context of uncertainty (Hansson 2009, p. 426).
These points are much indebted to Kian Mintz-Woo’s comments.
Thus, proponents of Maximin do not have to deny that decision-making principles process degrees of belief which should correspond to the support given by the evidence. But they claim that we should rely not on degrees of belief about what will happen in the future but rather on degrees of belief about what can possibly happen in the future.
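Maximin can be stated compactly; this is the standard textbook formulation rather than a quotation from the text. For actions $a \in A$ and states $s \in S$ with utility $u(a,s)$, Maximin selects

```latex
\[
a^{*} \;=\; \arg\max_{a \in A} \; \min_{s \in S} \, u(a, s),
\]
% i.e. rank each action by its worst possible outcome and choose the
% action whose worst case is least bad. The probabilities of the states
% s play no role; only which states are regarded as possible matters.
```

This makes vivid the point at issue: the rule consumes judgments of possibility, not degrees of belief about which state will obtain.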
An additional problem is that policy advice would depend crucially on how probable a consequence needs to be before it can count as “realistic”, even though there are few criteria that would help us set the necessary level of probability for a consequence to count as “realistic” at one point or another.
Admittedly, my reasoning is not directly relevant for policy-making since I have not taken into account such constraints as the inaptitude of the human brain to process probabilities sensibly, the pitfalls of communication between scientists and policy-makers, etc. Taking such constraints into account might yield the result that promoting a Precautionary Principle based on the risk-uncertainty distinction (e.g. Maximin) has the right effects in the real world. However, it would then be a rough-and-ready rule of thumb whose public acceptance is instrumentally useful rather than a principle that captures the correct rationale for precautionary policy-making.
References
Al-Najjar, N., & Weinstein, J. (2015). A Bayesian model of Knightian uncertainty. Theory and Decision, 78(1), 1–22.
Arrow, K. (1951). Alternative approaches to the theory of choice in risk-taking situations. Econometrica, 19(4), 404–437.
Betz, G. (2010). What’s the worst case? The methodology of possibilistic prediction. Analyse und Kritik, 32(1), 87–106.
Bognar, G. (2011). Can the maximin principle serve as a basis for climate change policy? The Monist, 94(3), 329–348.
Bostrom, N., & Ćirković, M. (2008). Introduction. In N. Bostrom & M. Ćirković (Eds.), Global catastrophic risks (pp. 1–30). Oxford: Oxford University Press.
Broome, J. (2007). Valuing policies in response to climate change: Some ethical issues. http://webarchive.nationalarchives.gov.uk/20130129110402/http://www.hm-treasury.gov.uk/d/stern_review_supporting_technical_material_john_broome_261006.pdf. Accessed February 6, 2016.
De Finetti, B. (1974). Theory of probability. New York: Wiley.
Ellsberg, D. (1961). Risk, ambiguity, and the Savage axioms. The Quarterly Journal of Economics, 75(4), 643–669.
Elster, J. (1983). Explaining technical change. Cambridge: Cambridge University Press.
Epstein, L., & Zhang, J. (2001). Subjective probabilities on subjectively unambiguous events. Econometrica, 69(2), 265–306.
Friedman, M. (1976). Price theory: A provisional text. Chicago: Aldine.
Gärdenfors, P., & Sahlin, N. (1983). Decision making with unreliable probabilities. British Journal of Mathematical and Statistical Psychology, 36(2), 240–251.
Gardiner, S. (2006). A core precautionary principle. The Journal of Political Philosophy, 14(1), 33–60.
Gardiner, S. (2010). Ethics and global climate change. In S. Gardiner, S. Caney, D. Jamieson, & H. Shue (Eds.), Climate ethics: Essential readings (pp. 146–162). Oxford: Oxford University Press.
Hacking, I. (2001). An introduction to probability and inductive logic. Cambridge: Cambridge University Press.
Hájek, A. (2012). Interpretations of probability. In E. Zalta (Ed.), The Stanford encyclopedia of philosophy (Winter 2012 Edition). http://plato.stanford.edu/archives/win2012/entries/probability-interpret. Accessed February 9, 2016.
Hansson, S. O. (2009). From the casino to the jungle. Synthese, 168(3), 423–432.
Harsanyi, J. (1976). Essays on ethics, social behaviour, and scientific explanation. Dordrecht: Reidel.
Hirshleifer, J., & Riley, J. (1992). The analytics of uncertainty and information. Cambridge: Cambridge University Press.
IPCC. (2014). Climate change 2014: Mitigation of climate change. Contribution of working group III to the fifth assessment report of the intergovernmental panel on climate change. Cambridge: Cambridge University Press.
Kelsey, D., & Quiggin, J. (1992). Theories of choice under ignorance and uncertainty. Journal of Economic Surveys, 6(2), 133–153.
Keynes, J. M. (1937). The general theory of employment. The Quarterly Journal of Economics, 51(2), 209–223.
Knight, F. (1921). Risk, uncertainty and profit. Boston: Houghton Mifflin.
Kuhn, K. (1997). Communicating uncertainty: Framing effects on responses to vague probabilities. Organizational Behavior and Human Decision Processes, 71(1), 55–83.
Langlois, R., & Cosgel, M. (1993). Frank Knight on risk, uncertainty, and the firm: A new interpretation. Economic Inquiry, 31(3), 456–465.
LeRoy, S., & Singell, L. (1987). Knight on risk and uncertainty. Journal of Political Economy, 95(2), 394–406.
Luce, R., & Raiffa, H. (1957). Games and decisions: Introduction and critical survey. New York: Dover Publications.
Machina, M., & Siniscalchi, M. (2013). Ambiguity and ambiguity aversion. In M. Machina & K. Viscusi (Eds.), Handbook of the economics of risk and uncertainty. Amsterdam: North-Holland.
Mas-Colell, A., Whinston, M., & Green, J. (1995). Microeconomic theory. New York: Oxford University Press.
McCann, C. (2003). Probability foundations of economic theory. London: Routledge.
McKinnon, C. (2009). Runaway climate change: A justice based case for precautions. Journal of Social Philosophy, 40, 187–204.
Mellor, D. (2005). Probability: A philosophical introduction. Abingdon: Routledge.
Mintz-Woo, K. (2017). A new defence of probability discounting. In A. Walsh, S. Hormio, & D. Purves (Eds.), The ethical underpinnings of climate economics (pp. 87–102). London: Routledge.
Moellendorf, D. (2014). The moral challenge of dangerous climate change: Values, poverty, and policy. Cambridge: Cambridge University Press.
Peterson, M. (2006). The precautionary principle is incoherent. Risk Analysis, 26(3), 595–601.
Peterson, M. (2009). An introduction to decision theory. Cambridge: Cambridge University Press.
Plantinga, A. (1993). Warrant and proper function. Oxford: Oxford University Press.
Posner, R. (2004). Catastrophe: Risk and response. Oxford: Oxford University Press.
Rawls, J. (1999). A theory of justice (Revised ed.). Cambridge, MA: Harvard University Press.
Resnik, D. (2003). Is the precautionary principle unscientific? Studies in the History and Philosophy of Biology and the Biomedical Sciences, 34, 329–344.
Roser, D., & Seidel, C. (2017). Climate justice: An introduction. London: Routledge.
Runde, J. (1998). Clarifying Frank Knight’s discussion of the meaning of risk and uncertainty. Cambridge Journal of Economics, 22(5), 539–546.
Shue, H. (2010). Deadly delays, saving opportunities. In S. Gardiner, S. Caney, D. Jamieson, & H. Shue (Eds.), Climate ethics: Essential readings (pp. 146–162). Oxford: Oxford University Press.
Steel, D. (2014). Philosophy and the precautionary principle. Cambridge: Cambridge University Press.
Stern, N. (2007). The economics of climate change: The Stern review. Cambridge: Cambridge University Press.
Sunstein, C. (2005). Cost-benefit analysis and the environment. Ethics, 115(2), 351–385.
Tosun, J. (2013). How the EU handles uncertain risks: Understanding the role of the precautionary principle. Journal of European Public Policy, 20(10), 1517–1528.
Vardas, G., & Xepapadeas, A. (2010). Model uncertainty, ambiguity and the precautionary principle: Implications for biodiversity management. Environmental & Resource Economics, 45(3), 379–404.
Weitzman, M. (2007). A review of the Stern Review on the economics of climate change. Journal of Economic Literature, 45(3), 703–724.
Wigley, T., & Raper, S. (2001). Interpretation of high projections for global-mean warming. Science, 293, 451–454.
Zeckhauser, R. (2013). New frontiers beyond risk and uncertainty: Ignorance, group decision, and unanticipated themes. In M. Machina & K. Viscusi (Eds.), Handbook of the economics of risk and uncertainty (pp. xvii–xxix). Amsterdam: North-Holland.
Acknowledgements
I thank Claus Beisbart, Gregor Betz, Dave Frame, Conrad Heilmann, Kian Mintz-Woo, David Moss, Martin Peterson, Matthew Rendall, Christian Seidel, various audiences, and an anonymous referee for valuable help.
Funding
Funding was provided by Nanjing University of Information Science and Technology and University of Fribourg.
Roser, D. The Irrelevance of the Risk-Uncertainty Distinction. Sci Eng Ethics 23, 1387–1407 (2017). https://doi.org/10.1007/s11948-017-9919-x