Abstract
Mark Alfano claims that the heuristics and biases literature supports inferential cognitive situationism, i.e., the view that most of our inferential beliefs are arrived at and retained by means of unreliable heuristics rather than intellectual virtues. If true, this would present virtue reliabilists with an unpleasant choice: they can either accept inferential skepticism, or modify or abandon reliabilism. Alfano thinks that the latter course of action is most plausible, and several reliabilists seem to agree. I argue that this is not the case. If situationism is true, then inferential non-skepticism is no more plausible than reliabilism. But inferential cognitive situationism is false. The heuristic-based inferences that facilitate successful perception and communication have proven remarkably accurate, and even the psychological research on inductive reasoning does not support Alfano’s situationism. More generally, negative assessments of human reasoning tend to ignore the fact that the research on cognitive biases focuses primarily on the performance of individuals in isolation. Several studies suggest that we reason much more effectively when in critical dialogue with others, which highlights the fact that our epistemic performance depends not only on the inner workings of our cognitive processes, but also on the environments in which they operate.
Notes
Alfano then proceeds to voice his own reservations concerning this reaction to epistemic situationism.
Different forms of radical skepticism concern different domains: while Pyrrhonian skeptics deny the possibility of any knowledge whatsoever, Cartesian skeptics deny knowledge of the external world.
This formulation of inferential skepticism is highly implausible. Given a weak principle of epistemic closure, we can be said to know a tremendous amount inferentially. I know, for example, that because it is the year 2018, it is not the year 2017, 2016, 2015, etc. A more plausible formulation of inferential skepticism, and one in keeping with Alfano’s concerns about our use of heuristics, would exclude these sorts of deductively closed trivial truths. This is a much weaker doctrine than any form of radical skepticism. I am thankful to Jon Marsh for pointing this out to me.
Alternatively, one might want to classify this belief as the result of intuition rather than inference. This re-classification does nothing to address my objection, however, since the heuristics and biases literature that Alfano relies on to make his case for ICS may be used to call the reliability of this intuition into doubt, as I do in the remainder of this paragraph.
For an important collection of papers on stereotype accuracy, see Lee et al. (1995).
This move parallels Gigerenzer’s move to an ecological conception of rationality, which “…refers to the study of how cognitive strategies exploit the representation and structure of information in the environment to make reasonable judgments and decisions” (Gigerenzer 2000, p. 57).
For a clear delineation of the reckoning and response theories of inference, see Siegel (2017, Ch. 5).
In fairness to Alfano and Gould, Kahneman and Tversky themselves drew this conclusion from their early work:
In making predictions and judgments under uncertainty, people do not appear to follow the calculus of chance or the statistical theory of prediction. Instead, they rely on a limited number of heuristics which sometimes yield reasonable judgments and sometimes lead to severe and systematic errors (Kahneman and Tversky 1973, p. 273).
This line of argument constitutes what Carter and Pritchard (2017) call bias-driven skepticism. They find it not only in Alfano’s work on situationism, but in Saul’s claim that “…what we know about implicit biases shows us that we have very good reason to believe that we cannot properly trust our knowledge-seeking faculties” (2013, p. 243).
Two cautionary points are worth emphasizing. First, it remains to be empirically established that confirmation bias is sufficiently ubiquitous to pose a threat to our inferential cognition generally. Second, confirmation bias has a possible upside: while it sometimes prevents us from abandoning false beliefs, it can also decrease our chances of abandoning true beliefs. Thus, the existence of confirmation bias can be used to bolster ICS only if there are independent grounds for thinking that our inferential processes produce a significant number of false beliefs. Even if both of these claims can be established, however, there are reasons to be dubious of the situationist’s pessimistic conclusion, as I will argue below.
‘Superforecasters’ is Tetlock’s term for individuals who outperform the vast majority of forecasters. Tetlock found similar results more generally, i.e., with regular forecasters as well: “At the end of the [first] year, the results were unequivocal: on average, teams were 23% more accurate than individuals” (Tetlock and Gardner 2015, p. 201).
They are also what Morton (2012) calls, more broadly, paradoxical virtues.
References
Alfano, M. (2012). Extending the situationist challenge to responsibilist virtue epistemology. Philosophical Quarterly, 62(247), 223–249.
Alfano, M. (2014). Extending the situationist challenge to reliabilism about inference. In A. Fairweather (Ed.), Virtue epistemology naturalized (pp. 103–122). Dordrecht: Springer.
Alfano, M. (2017). Epistemic situationism: An extended prolepsis. In A. Fairweather & M. Alfano (Eds.), Epistemic situationism (pp. 44–61). Oxford: Oxford University Press.
Baron, R. S., Hoppe, S. I., Kao, C. F., Brunsman, B., Linneweh, B., & Rogers, D. (1996). Social corroboration and opinion extremity. Journal of Experimental Social Psychology, 32, 537–560.
Bishop, M., & Trout, J. D. (2002). 50 Years of successful predictive modeling should be enough: Lessons for philosophy of science. Philosophy of Science: PSA 2000 Symposium Papers, 69, 197–208.
Carroll, J., Winer, R., Coates, D., Galegher, J., & Alibrio, J. (1988). Evaluation, diagnosis, and prediction in parole decision-making. Law and Society Review, 17, 199–228.
Carter, J. A., & Pritchard, D. (2017). Cognitive bias, scepticism and understanding. In S. R. Grimm, C. Baumberger, & S. Ammon (Eds.), Explaining understanding: New perspectives from epistemology and philosophy of science (pp. 272–292). New York: Routledge.
Davidson, D. (1986). A coherence theory of truth and knowledge. In E. LePore (Ed.), Truth and interpretation: Perspectives on the philosophy of Donald Davidson (pp. 307–319). Oxford: Basil Blackwell.
Fairweather, A., & Montemayor, C. (2017). Knowledge, dexterity, and attention: A theory of epistemic agency. Cambridge: Cambridge University Press.
Fischhoff, B. (1982). Debiasing. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgement under uncertainty: Heuristics and biases (pp. 422–444). Cambridge: Cambridge University Press.
Gigerenzer, G. (2000). Adaptive thinking: Rationality in the real world. Oxford: Oxford University Press.
Gigerenzer, G. (2007). Gut feelings: The intelligence of the unconscious. London: Penguin Books.
Gigerenzer, G., & Hoffrage, U. (1995). How to improve Bayesian reasoning without instruction: Frequency formats. Psychological Review, 102, 684–704.
Gould, S. J. (1992). Bully for brontosaurus: Further reflections in natural history. New York: Penguin Books.
Hertwig, R., & Gigerenzer, G. (1999). The “conjunction fallacy” revisited: How intelligent inferences look like reasoning errors. Journal of Behavioral Decision Making, 12, 275–305.
Hoffrage, U. (2004). Overconfidence. In R. Pohl (Ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgment and memory (pp. 235–254). Hove: Psychology Press.
Kahneman, D. (2011). Thinking, fast and slow. London: Penguin Books.
Kahneman, D., & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80, 237–251.
Kenyon, T., & Beaulac, G. (2014). Critical thinking education and debiasing. Informal Logic, 34(4), 341–363.
Lee, Y., Jussim, L. J., & McCauley, C. R. (1995). Stereotype accuracy: Toward appreciating group differences. Washington, D.C.: American Psychological Association.
Lehman, D. R., Lempert, R. O., & Nisbett, R. (1988). The effects of graduate training on reasoning: Formal discipline and thinking about everyday-life events. American Psychologist, 43(6), 431–442.
Meehl, P. (1954). Clinical versus statistical prediction: A theoretical analysis and a review of the evidence. Minneapolis: University of Minnesota Press.
Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34(2), 57–74.
Mercier, H., & Sperber, D. (2017). The enigma of reason. Cambridge: Harvard University Press.
Morton, A. (2012). Bounded thinking: Intellectual virtues for limited agents. Oxford: Oxford University Press.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175–220.
Nisbett, R., Fong, G. T., Lehman, D. R., & Cheng, P. W. (1987). Teaching reasoning. Science, 238(4827), 625–631.
Nisbett, R., Krantz, D. H., Jepson, C., & Kunda, Z. (2002). The use of statistical heuristics in everyday inductive reasoning. In T. Gilovich, D. W. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 510–533). Cambridge: Cambridge University Press.
O’Brien, B. (2009). Prime suspect: An examination of factors that aggravate and counteract confirmation bias in criminal investigations. Psychology, Public Policy, and Law, 15(4), 315–334.
Pinker, S. (1997). How the mind works. New York: W. W. Norton & Company.
Pronin, E., Lin, D., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28, 369–381.
Saul, J. (2013). Scepticism and implicit bias. Disputatio, 5(37), 243–263.
Sedikides, C., & Gregg, A. P. (2007). The why’s the limit: Curtailing self-enhancement with explanatory introspection. Journal of Personality, 75, 783–824.
Siegel, S. (2017). The rationality of perception. Oxford: Oxford University Press.
Smart, P. R. (2018). Mandevillian intelligence. Synthese, 195, 4169–4200.
Sosa, E. (2017). Virtue theory against situationism. In A. Fairweather & M. Alfano (Eds.), Epistemic situationism (pp. 116–134). Oxford: Oxford University Press.
Stasser, G., & Titus, W. (2003). Hidden profiles: A brief history. Psychological Inquiry, 14, 304–313.
Tetlock, P. (2005). Expert political judgment. Princeton: Princeton University Press.
Tetlock, P., & Gardner, D. (2015). Superforecasting: The art and science of prediction. Toronto: Signal.
Turri, J. (2017). Epistemic situationism and cognitive ability. In A. Fairweather & M. Alfano (Eds.), Epistemic situationism (pp. 158–167). Oxford: Oxford University Press.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207–232.
Tversky, A., & Kahneman, D. (2002). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. In T. Gilovich, D. W. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 19–48). Cambridge: Cambridge University Press.
Wilson, T. D., Centerbar, D. B., & Brekke, N. (2002). Mental contamination and the debiasing problem. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 185–200). Cambridge: Cambridge University Press.
Acknowledgements
I am grateful to an audience at the University of Glasgow for their discussion of an earlier version of this paper. I owe a special note of thanks to Jon Marsh and two of this journal’s referees for their insightful, detailed, and constructive comments.
Cite this article
Bland, S. Cognitive bias, situationism, and virtue reliabilism. Synthese 198, 471–490 (2021). https://doi.org/10.1007/s11229-018-02031-6