Abstract
Belief revision is the problem of finding the most plausible explanation for an observed body of evidence. It has applications in many scientific domains, including natural language understanding, medical diagnosis, and computational biology. Bayesian networks (BNs) are an important probabilistic graphical formalism widely used for belief revision tasks. In a BN, belief revision can be achieved by assigning values to all random variables so that their joint probability is maximized; this assignment is called the maximum a posteriori (MAP) assignment. Finding the MAP assignment is NP-hard. In this paper, we propose finding the MAP assignment in a BN using high order recurrent neural networks, via an intermediate representation of cost-based abduction. This method eliminates the need to construct the energy function explicitly in two parts, an objective and constraints, which decreases the number of free parameters to set.
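To make the MAP notion concrete, here is a minimal sketch of exhaustive MAP search on a toy two-variable network A → B. The conditional probability tables are invented for illustration and are not from the paper; the point is only that the MAP assignment is the complete assignment maximizing the joint probability, which is tractable by brute force only on tiny networks (the general problem is NP-hard, as the paper notes).

```python
from itertools import product

# Toy Bayesian network A -> B with hypothetical CPTs (illustration only).
p_a = {True: 0.3, False: 0.7}
p_b_given_a = {
    True:  {True: 0.9, False: 0.1},
    False: {True: 0.2, False: 0.8},
}

def joint(a, b):
    """Joint probability P(A=a, B=b) = P(A=a) * P(B=b | A=a)."""
    return p_a[a] * p_b_given_a[a][b]

# Exhaustive search over all complete assignments: feasible here,
# but exponential in the number of variables in general.
map_assignment = max(product([True, False], repeat=2),
                     key=lambda ab: joint(*ab))
print(map_assignment, joint(*map_assignment))  # -> (False, False) 0.56
```

The approach in the paper avoids this exponential enumeration by encoding the problem as cost-based abduction and letting a high order recurrent network minimize the resulting energy function instead.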
© 2009 Springer-Verlag Berlin Heidelberg
Andrews, E.A.M., Bonner, A.J. (2009). Finding MAPs Using High Order Recurrent Networks. In: Leung, C.S., Lee, M., Chan, J.H. (eds) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol 5863. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10677-4_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-10676-7
Online ISBN: 978-3-642-10677-4