Abstract.
This paper considers the relative entropy between the conditional distribution and an incorrectly initialized filter for the estimation of one component of a Markov process given observations of the other component. Using the Markov property, we first establish a decomposition of the relative entropy between the measures on observation path space associated with different initial conditions. Using this decomposition, it is shown that the relative entropy of the optimal filter with respect to an incorrectly initialized filter is a positive supermartingale. By applying the decomposition to signals observed in additive white noise, a relative entropy bound is obtained on the integrated, expected mean-square difference between the optimal and incorrectly initialized estimates of the observation function.
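To make the final statement concrete, the bound described above can be written schematically as follows. This is a sketch of the type of inequality the abstract refers to, not a verbatim statement from the paper: the observation model dY_t = h(X_t) dt + dW_t, the notation pi_t (optimal filter started from the true initial law p) and pi_t^q (filter started from an incorrect initial law q), the constant 1/2, and the infinite time horizon are all assumed here for illustration.

% Schematic relative entropy bound (notation assumed for illustration):
% X = signal, observations dY_t = h(X_t) dt + dW_t (additive white noise),
% pi_t   = optimal filter, initialized with the true law p of X_0,
% pi_t^q = filter initialized with an incorrect law q.
% The abstract also states that t -> D(pi_t || pi_t^q) is a positive supermartingale.
\[
  \tfrac{1}{2}\,\mathbb{E}\!\int_0^\infty
    \bigl|\pi_t(h) - \pi_t^q(h)\bigr|^2 \,\mathrm{d}t
  \;\le\; D(p\,\|\,q),
  \qquad
  D(p\,\|\,q) = \int \log\!\frac{\mathrm{d}p}{\mathrm{d}q}\,\mathrm{d}p ,
\]
where D(p||q) is the relative entropy of the true initial law p with respect to the incorrect initial law q. The point of such a bound is that the cumulative mean-square effect of misinitializing the filter is controlled by a single information-theoretic quantity evaluated at time zero.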
Additional information
Date received: October 6, 1997. Date revised: April 9, 1999.
Cite this article
Clark, J., Ocone, D. & Coumarbatch, C. Relative Entropy and Error Bounds for Filtering of Markov Processes. Math. Control Signals Systems 12, 346–360 (1999). https://doi.org/10.1007/PL00009856