Inference in hybrid Bayesian networks using mixtures of polynomials

https://doi.org/10.1016/j.ijar.2010.09.003

Abstract

The main goal of this paper is to describe inference in hybrid Bayesian networks (BNs) using mixture of polynomials (MOP) approximations of probability density functions (PDFs). Hybrid BNs contain a mix of discrete, continuous, and conditionally deterministic random variables. The conditionals for continuous variables are typically described by conditional PDFs. A major hurdle in making inferences in hybrid BNs is the marginalization of continuous variables, which involves integrating combinations of conditional PDFs. In this paper, we suggest the use of MOP approximations of PDFs, which are similar in spirit to mixture of truncated exponentials (MTE) approximations. MOP functions are easily integrated and are closed under combination and marginalization. This enables us to propagate MOP potentials in the extended Shenoy–Shafer architecture for inference in hybrid BNs that can include deterministic variables. MOP approximations have several advantages over MTE approximations of PDFs: they are easier to find, even for multi-dimensional conditional PDFs, and they are applicable to a larger class of deterministic functions in hybrid BNs.
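To illustrate the idea of a MOP potential and why marginalization is easy, the sketch below approximates the standard normal PDF on [-3, 3] by a 3-piece mixture of polynomials and integrates it in closed form. This is a minimal illustration, not the construction used in the paper: the piece boundaries, the polynomial degree (4), and the use of Lagrange interpolation via sympy are assumptions made here for demonstration only.

```python
# Illustrative sketch (not the paper's construction): a 3-piece MOP
# approximation of the standard normal PDF on [-3, 3], integrated exactly.
import sympy as sp

x = sp.symbols('x')
phi = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)   # standard normal PDF

# On each piece, fit a degree-4 polynomial through 5 equally spaced points
# of the PDF (Lagrange interpolation); outside [-3, 3] the potential is 0.
pieces = []
for lo, hi in [(-3, -1), (-1, 1), (1, 3)]:
    nodes = [lo + (hi - lo) * k / sp.Integer(4) for k in range(5)]
    data = [(t, phi.subs(x, t).evalf()) for t in nodes]
    pieces.append((sp.expand(sp.interpolate(data, x)), lo, hi))

# Marginalizing the continuous variable reduces to term-by-term polynomial
# integration over each piece, which sympy performs exactly. The result
# should be close to 0.9973, the normal probability mass on [-3, 3].
mass = sum(sp.integrate(p, (x, lo, hi)) for p, lo, hi in pieces)
print(float(mass))
```

Because each piece is a polynomial, products of such potentials (combination) and their integrals (marginalization) are again piecewise polynomials, which is the closure property the abstract refers to.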

Keywords

Hybrid Bayesian networks
Inference in hybrid Bayesian networks
Shenoy–Shafer architecture
Extended Shenoy–Shafer architecture
Mixtures of polynomials
Mixtures of truncated exponentials
