
Markov Limid processes for representing and solving renewal problems

Published in Annals of Operations Research.

Abstract

This paper presents a new tool for the simultaneous optimisation of decisions on multiple time scales. The tool combines the dynamic properties of Markov decision processes with the flexible and compact state space representation of Limited Memory Influence Diagrams (Limids). A temporal version of Limids, TemLimids, is defined by adding time-related functions to utility nodes. As a result, expected discounted utility as well as expected relative utility may be used as optimisation criteria in TemLimids. Optimisation proceeds as in ordinary Limids. A sequence of TemLimids can then be used to model a Markov Limid Process, in which each TemLimid represents a macro action. Algorithms are presented for finding optimal plans over a sequence of such macro actions. Their use is illustrated with an extended version of the pig-production example originally used to introduce the Limid concept.
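The optimisation criterion described in the abstract — maximising expected discounted utility over a sequence of macro actions, each with its own duration — can be illustrated with a small dynamic program. The sketch below is purely illustrative and not the paper's method: the paper's algorithms operate on TemLimids rather than on an explicit transition table, and all state names, rewards, and durations here are invented for a toy renewal (keep-or-replace) problem.

```python
import math

def solve_macro_mdp(states, actions, trans, reward, duration,
                    rate=0.05, tol=1e-9, max_iter=10_000):
    """Value iteration in which a macro action a taken in state s lasts
    duration[s][a] time units and is therefore discounted by
    exp(-rate * duration[s][a]) — a per-action discount factor, as opposed
    to the fixed per-stage factor of an ordinary discounted MDP."""
    V = {s: 0.0 for s in states}
    policy = {s: None for s in states}
    for _ in range(max_iter):
        delta = 0.0
        for s in states:
            best_q, best_a = -math.inf, None
            for a in actions:
                disc = math.exp(-rate * duration[s][a])
                q = reward[s][a] + disc * sum(
                    p * V[t] for t, p in trans[s][a].items())
                if q > best_q:
                    best_q, best_a = q, a
            delta = max(delta, abs(best_q - V[s]))
            V[s], policy[s] = best_q, best_a
        if delta < tol:
            break
    return V, policy

# Toy renewal problem (invented numbers): keep or replace an animal that
# ages from "young" to "old"; replacement restarts the cycle.
states = ["young", "old"]
actions = ["keep", "replace"]
trans = {
    "young": {"keep": {"old": 1.0},  "replace": {"young": 1.0}},
    "old":   {"keep": {"old": 1.0},  "replace": {"young": 1.0}},
}
reward = {  # net return of one macro action (production minus costs)
    "young": {"keep": 10.0, "replace": 2.0},
    "old":   {"keep": 4.0,  "replace": 2.0},
}
duration = {s: {a: 1.0 for a in actions} for s in states}  # one period each

V, policy = solve_macro_mdp(states, actions, trans, reward, duration)
```

With these numbers the optimal plan is the classic renewal policy — keep while young, replace when old. The per-action discount `exp(-rate * duration)` is the ingredient that time-related functions on utility nodes make available, since macro actions of different lengths must be discounted differently.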


Figures 1–3 (figure content omitted)


Notes

  1. http://www.prodstyr.ihh.kvl.dk/software/mlhmp.html.

  2. http://www.esthauge.dk.


Author information

Corresponding author

Correspondence to Erik Jørgensen.

Additional information

This research was carried out as part of Dina, Danish Informatics Network in the Agricultural Sciences.

About this article

Cite this article

Jørgensen, E., Kristensen, A.R. & Nilsson, D. Markov Limid processes for representing and solving renewal problems. Ann Oper Res 219, 63–84 (2014). https://doi.org/10.1007/s10479-012-1220-4
