Abstract:
We consider the problem of estimating the Probability Mass Function (PMF) of a discrete random vector (RV) from partial observations, namely when some elements in each observed realization may be missing. Since the PMF takes the form of a multi-way tensor, under certain model assumptions the problem becomes closely associated with tensor factorization. Indeed, recent studies have shown that a low-rank PMF tensor can be fully recovered (under some mild conditions) by applying a low-rank (approximate) joint factorization to all estimated joint PMFs of subsets of fixed cardinality larger than two (e.g., triplets). That joint factorization is based on a Least Squares (LS) fit to the estimated lower-order sub-tensors. In this letter we take a different estimation approach by fitting the factorization directly to the observed partial data in the sense of the Kullback-Leibler divergence (KLD). Consequently, we avoid the need to select and directly estimate sub-tensors of any particular order, as we inherently apply proper weighting to all the available partial data. We show that our approach essentially attains the Maximum Likelihood estimate of the full PMF tensor (under the low-rank model) and therefore enjoys its well-known properties of consistency and asymptotic efficiency. In addition, based on the Bayesian interpretation of the low-rank model, we propose an Expectation-Maximization (EM) based approach, which is computationally cheap per iteration. Simulation results demonstrate the advantages of our proposed KLD-based hybrid approach (combining alternating-directions minimization with EM) over LS fitting of sub-tensors.
Published in: IEEE Signal Processing Letters (Volume 26, Issue 10, October 2019)
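The EM approach mentioned in the abstract rests on the Bayesian interpretation of the low-rank model: a rank-R PMF tensor is equivalent to a naive-Bayes mixture, P(x_1,...,x_N) = Σ_r λ_r Π_n A_n(x_n, r), so each sample can be seen as drawn from one of R latent components, and missing coordinates simply marginalize out of the likelihood. The following is only an illustrative sketch of such an EM iteration on incomplete data (function name, encoding of missing entries as -1, and all parameter names are ours, not the authors'); it is not the paper's exact algorithm, which combines EM with alternating-directions minimization of the KLD objective.

```python
import numpy as np

def em_lowrank_pmf(X, dims, R, n_iter=50, seed=0):
    """Illustrative EM for a rank-R naive-Bayes (CP) PMF model from data
    with missing entries (encoded as -1).

    X    : (T, N) int array of observed realizations, -1 marks a missing entry
    dims : list of alphabet sizes (I_1, ..., I_N)
    Returns mixing weights lam (R,) and column-stochastic factors A[n] (I_n, R).
    """
    rng = np.random.default_rng(seed)
    T, N = X.shape
    lam = np.full(R, 1.0 / R)
    # Random column-stochastic initialization of each factor matrix.
    A = [rng.dirichlet(np.ones(dims[n]), size=R).T for n in range(N)]
    for _ in range(n_iter):
        # E-step: posterior over the latent component for each sample,
        # using only the observed coordinates (missing ones marginalize out).
        post = np.tile(lam, (T, 1))                      # (T, R)
        for n in range(N):
            obs = X[:, n] >= 0
            post[obs] *= A[n][X[obs, n]]
        post /= post.sum(axis=1, keepdims=True)
        # M-step: closed-form reweighted empirical frequencies.
        lam = post.mean(axis=0)
        for n in range(N):
            obs = X[:, n] >= 0
            An = np.zeros((dims[n], R))
            np.add.at(An, X[obs, n], post[obs])          # soft counts
            A[n] = An / An.sum(axis=0, keepdims=True)
    return lam, A
```

Note the per-iteration cost is linear in the number of observed entries, which matches the abstract's point that the EM updates are computationally cheap per iteration compared with fitting all lower-order sub-tensors.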