
Deceptive updating and minimal information methods


Abstract

The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher order probabilities that ‘support’ it, but the appeal to higher order probabilities is a substantial assumption that some might reject. Our elementary arguments from deceptiveness do not rely on this assumption. While deceptiveness implies lack of higher order support, the converse does not, in general, hold, which indicates that deceptiveness is a more objectionable property. We offer a new proof of the claim that infomin updating of any strictly-positive prior with respect to conditional-probability constraints is deceptive. In the case of expected-value constraints, infomin updating of the uniform prior is deceptive for some random variables but not for others. We establish both a necessary condition and a sufficient condition (which extends the scope of the phenomenon beyond cases previously considered) for deceptiveness in this setting. Along the way, we clarify the relation which obtains between the strong notion of higher order support, in which the higher order probability is defined over the full space of first order probabilities, and the apparently weaker notion, in which it is defined over some smaller parameter space. We show that under certain natural assumptions, the two are equivalent. Finally, we offer an interpretation of Jaynes, according to which his own appeal to infomin methods avoids the incoherencies discussed in this paper.
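
To see the method in action, the following is a minimal numerical sketch (not code from the paper) of an infomin update under an expected-value constraint, the setting discussed in the abstract. It is a standard fact that the distribution q minimizing the Kullback–Leibler divergence D(q‖p) subject to E_q[X] = c has the exponential form q_i ∝ p_i·exp(λ·x_i); the function name `infomin_update`, the use of SciPy's `brentq` root-finder, and the die example are our own illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

def infomin_update(p, x, c):
    """Infomin (minimum relative entropy) update of a prior p over a
    finite space, subject to the expected-value constraint E_q[X] = c.

    The minimizer of D(q || p) = sum_i q_i log(q_i / p_i) under this
    constraint has the exponential form q_i proportional to
    p_i * exp(lam * x_i); we solve numerically for lam.
    """
    p = np.asarray(p, dtype=float)
    x = np.asarray(x, dtype=float)

    def constraint_gap(lam):
        w = p * np.exp(lam * x)
        q = w / w.sum()
        return q @ x - c          # zero exactly when E_q[X] = c

    lam = brentq(constraint_gap, -50.0, 50.0)
    w = p * np.exp(lam * x)
    return w / w.sum()

# Jaynes-style die example: uniform prior over six faces, updated on
# the constraint that the expected face value is 4.5 rather than 3.5.
faces = np.arange(1, 7)
prior = np.full(6, 1 / 6)
posterior = infomin_update(prior, faces, 4.5)
print(posterior)            # mass shifted toward the higher faces
print(posterior @ faces)    # recovers 4.5
```

Because the prior here is uniform, minimizing D(q‖p) is equivalent to maximizing the Shannon entropy of q, so this infomin update coincides with Jaynes's maximum entropy rule; it is updates of exactly this kind, under expected-value constraints, whose deceptiveness (for some random variables but not others) the paper characterizes.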


References

  • de Finetti B. (1974) Theory of probability (Vol. 1). John Wiley and Sons, New York, NY

  • Friedman K., Shimony A. (1971) Jaynes’s maximum entropy prescription and probability theory. Journal of Statistical Physics 3(4): 381

  • Gaifman H. (1983) Paradoxes of infinity and self-applications, I. Erkenntnis 20: 131–155

  • Gaifman H. (1986) A theory of higher order probabilities. In: Halpern J. (ed) Theoretical aspects of reasoning about knowledge. Morgan Kaufmann Publishers Inc, San Francisco, CA

  • Gaifman H. (2004) Reasoning with limited resources and assigning probabilities to arithmetical statements. Synthese 140: 97–119

  • Gaifman H., Snir M. (1982) Probabilities over rich languages, testing and randomness. Journal of Symbolic Logic 47(3): 495–548

  • Good I. J. (1972) 46,656 varieties of Bayesians. American Statistician 25: 62–63

  • Grove A., Halpern J. (1997) Probability update: Conditioning vs. cross entropy. In: Proceedings of the thirteenth annual conference on uncertainty in artificial intelligence

  • Hobson A., Cheng B. (1973) A comparison of the Shannon and Kullback information measures. Journal of Statistical Physics 7(4): 301–310

  • Jaynes E. T. (1957) Information theory and statistical mechanics, I. Physical Review 106: 620–630

  • Jaynes E. T. (1968) Prior probabilities. In: Rosenkrantz R. (ed) E. T. Jaynes: Papers on probability, statistics and statistical physics. D. Reidel Publishing Company, Boston, MA, pp 116–130

  • Jaynes E. T. (1983) Where do we stand on maximum entropy? In: Rosenkrantz R. (ed) E. T. Jaynes: Papers on probability, statistics and statistical physics. D. Reidel Publishing Company, Boston, MA, pp 210–314

  • Jaynes E. T. (2003) Probability theory: The logic of science. Cambridge University Press, Cambridge

  • Jeffrey R. (1965) The logic of decision. McGraw Hill, New York

  • Keynes J. M. (1920) A treatise on probability (2006 edn). Cosimo, Inc., New York, NY

  • Kullback S., Leibler R. (1951) On information and sufficiency. Annals of Mathematical Statistics 22(1): 79–86

  • Levi I. (1985) Imprecision and indeterminacy in probability judgment. Philosophy of Science 52(3): 390–409

  • Paris J. (1998) Common sense and maximum entropy. Synthese 117(1): 75–93

  • Paris J., Vencovská A. (1997) In defense of the maximum entropy inference process. International Journal of Approximate Reasoning 17(1): 77–103

  • Putnam H. (1963) Degree of confirmation and inductive logic. In: Schilpp P. (ed) The philosophy of Rudolf Carnap. The Open Court Publishing Co, La Salle, IL, pp 761–784

  • Savage L. (1954) The foundations of statistics. John Wiley and Sons, New York

  • Seidenfeld T. (1979) Why I am not an objective Bayesian. Theory and Decision 11: 413–440

  • Seidenfeld T. (1987) Entropy and uncertainty (revised). In: MacNeill I., Humphreys G. (eds) Foundations of statistical inference. D. Reidel Publishing Co, Dordrecht, pp 259–287

  • Shannon C. E. (1948) A mathematical theory of communication. The Bell System Technical Journal 27: 379–423

  • Shimony A. (1973) Comment on the interpretation of inductive probabilities. Journal of Statistical Physics 9(2): 187–191

  • Shimony A. (1985) The status of the principle of maximum entropy. Synthese 63: 35–53

  • Shore J., Johnson R. (1980) Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory IT-26(1): 26–37

  • van Fraassen B. (1981) A problem for relative information minimizers in probability kinematics. The British Journal for the Philosophy of Science 32(4): 375–379


Author information


Corresponding author

Correspondence to Haim Gaifman.

Additional information

We thank the anonymous referee for a thorough and unusually detailed review of an earlier draft of this work. In addressing the referee’s comments, we clarified a number of passages, elaborated various points, and ended up with a better paper.


About this article

Cite this article

Gaifman, H., Vasudevan, A. Deceptive updating and minimal information methods. Synthese 187, 147–178 (2012). https://doi.org/10.1007/s11229-011-0028-0

