Abstract
We address the problem of estimating an unknown probability density function from a sequence of input samples. We approximate the input density with a weighted mixture of a finite number of Gaussian kernels, whose parameters and weights we estimate iteratively from the input samples using the maximum likelihood (ML) procedure. To decide on the correct total number of kernels, we employ simple statistical tests involving the mean, variance, and kurtosis (standardized fourth moment) of a particular kernel. We demonstrate the validity of our method on both pattern classification (stationary) and time series (nonstationary) problems.
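The abstract describes the procedure only at a high level. The following is a minimal sketch of the general idea, assuming a standard EM-style maximum-likelihood update for a one-dimensional Gaussian mixture and a per-kernel weighted-kurtosis check; the function names, the weighted-kurtosis estimator, and any threshold on it are illustrative assumptions, not the paper's exact sequential test statistics.

```python
# Minimal sketch (not the authors' exact algorithm): fit a 1-D Gaussian mixture
# by iterative maximum-likelihood (EM-style) updates, then compute each kernel's
# weighted sample kurtosis as a rough check of whether that kernel is Gaussian.
import numpy as np

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def fit_mixture(x, k, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False)          # initial kernel means
    var = np.full(k, np.var(x))                        # initial variances
    w = np.full(k, 1.0 / k)                            # mixing weights
    for _ in range(n_iter):
        # E-step: posterior responsibility of each kernel for each sample
        dens = w * gaussian_pdf(x[:, None], mu, var)   # shape (n, k)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted ML updates of weights, means, and variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var, resp

def kurtosis_per_kernel(x, mu, var, resp):
    # Weighted standardized fourth moment; close to 3 where the data
    # assigned to a kernel are approximately Gaussian.
    z4 = ((x[:, None] - mu) ** 4) / var ** 2
    return (resp * z4).sum(axis=0) / resp.sum(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
    w, mu, var, resp = fit_mixture(x, k=2)
    kurt = kurtosis_per_kernel(x, mu, var, resp)
    # A kernel whose kurtosis deviates strongly from 3 would be a candidate
    # for splitting or removal; the decision rule here is purely illustrative.
    print("weights:", w, "means:", mu, "kurtosis:", kurt)
```

In this sketch the number of kernels is fixed in advance; the paper's contribution is precisely to grow or shrink that number online using sequential tests on these per-kernel statistics.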
Cite this article
Vlassis, N., Papakonstantinou, G. & Tsanakas, P. Mixture Density Estimation Based on Maximum Likelihood and Sequential Test Statistics. Neural Processing Letters 9, 63–76 (1999). https://doi.org/10.1023/A:1018624029058