Abstract
This paper presents a new tool for the simultaneous optimisation of decisions on multiple time scales. The tool combines the dynamic properties of Markov decision processes with the flexible and compact state space representation of Limited Memory Influence Diagrams (Limids). A temporal version of Limids, TemLimids, is defined by adding time-related functions to utility nodes. As a result, expected discounted utility as well as expected relative utility can be used as optimisation criteria in TemLimids. Optimisation proceeds as in ordinary Limids. A sequence of such TemLimids can be used to model a Markov Limid process, in which each TemLimid represents a macro action. Algorithms are presented for finding optimal plans for a sequence of such macro actions. Their use is illustrated with an extended version of an example from pig production originally used to introduce the Limid concept.
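The planning step over a sequence of macro actions can be illustrated with a standard finite-horizon backward induction under the expected discounted utility criterion. The following toy sketch is not the paper's algorithm (each macro action would in fact be a full TemLimid); it assumes each macro action has already been summarised into a transition matrix and per-stage expected utilities. All state, action, and parameter values are hypothetical, loosely inspired by the pig-production setting.

```python
# Illustrative sketch only: backward induction over "macro actions",
# each summarised by a transition matrix P and expected stage utility u.
# All names and numbers are hypothetical.

GAMMA = 0.95  # per-stage discount factor

# Two condition states: 0 = "poor", 1 = "good"
macro_actions = {
    "keep": {
        "P": [[0.7, 0.3], [0.2, 0.8]],  # row s: P(next state | state s)
        "u": [10.0, 25.0],              # expected utility per stage, by state
    },
    "replace": {
        "P": [[0.1, 0.9], [0.1, 0.9]],  # replacement resets the process
        "u": [-5.0, -5.0],              # replacement cost
    },
}

def backward_induction(horizon):
    """Optimal discounted value and per-stage policy for a finite horizon."""
    n = 2
    V = [0.0] * n          # terminal values
    policy = []            # policy[k][s]: best macro action, k stages from end
    for _ in range(horizon):
        new_V = [0.0] * n
        stage_policy = [None] * n
        for s in range(n):
            best_q, best_a = None, None
            for a, spec in macro_actions.items():
                # Q(s, a) = immediate utility + discounted expected future value
                q = spec["u"][s] + GAMMA * sum(
                    spec["P"][s][t] * V[t] for t in range(n)
                )
                if best_q is None or q > best_q:
                    best_q, best_a = q, a
            new_V[s] = best_q
            stage_policy[s] = best_a
        V = new_V
        policy.append(stage_policy)
    return V, policy

V, policy = backward_induction(horizon=20)
```

In the full model each entry of `macro_actions` would be replaced by a TemLimid whose evaluation supplies the transition probabilities and utilities, so the outer recursion stays unchanged while the inner summaries gain Limid structure.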
This research was carried out as part of Dina, Danish Informatics Network in the Agricultural Sciences.
Cite this article
Jørgensen, E., Kristensen, A.R. & Nilsson, D. Markov Limid processes for representing and solving renewal problems. Ann Oper Res 219, 63–84 (2014). https://doi.org/10.1007/s10479-012-1220-4