
Mixture Density Estimation Based on Maximum Likelihood and Sequential Test Statistics

Published in: Neural Processing Letters

Abstract

We address the problem of estimating an unknown probability density function from a sequence of input samples. We approximate the input density with a weighted mixture of a finite number of Gaussian kernels, whose parameters and weights we estimate iteratively from the input samples using the Maximum Likelihood (ML) procedure. To decide on the correct total number of kernels we employ simple statistical tests involving the mean, variance, and kurtosis (standardized fourth moment) of a particular kernel. We demonstrate the validity of our method on both pattern classification (stationary) and time series (nonstationary) problems.
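
To make the procedure described above concrete, the following is a minimal Python sketch, not the authors' exact algorithm: a batch EM (maximum likelihood) update for a univariate Gaussian mixture, followed by a per-kernel check of the responsibility-weighted kurtosis against the Gaussian reference value of 3. The tolerance KURTOSIS_TOL, the flagging rule, and the synthetic data are illustrative assumptions; the paper's sequential test statistics differ in detail.

```python
# Minimal sketch (assumptions noted above): EM for a 1-D Gaussian mixture
# plus an illustrative kurtosis check per kernel.
import numpy as np

KURTOSIS_TOL = 0.5  # assumed tolerance around the Gaussian kurtosis of 3


def em_step(x, weights, means, variances):
    """One EM iteration for a univariate Gaussian mixture."""
    # E-step: responsibilities r[i, j] = P(kernel j | sample i)
    diff = x[:, None] - means[None, :]
    lik = np.exp(-0.5 * diff**2 / variances) / np.sqrt(2 * np.pi * variances)
    r = weights * lik
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, variances
    nk = r.sum(axis=0)
    weights = nk / len(x)
    means = (r * x[:, None]).sum(axis=0) / nk
    variances = (r * (x[:, None] - means) ** 2).sum(axis=0) / nk
    return weights, means, variances


def kernel_kurtosis(x, r_j, mean_j, var_j):
    """Responsibility-weighted sample kurtosis of one kernel (Gaussian -> 3)."""
    return (r_j * (x - mean_j) ** 4).sum() / (r_j.sum() * var_j**2)


# Usage sketch: fit a two-kernel mixture to synthetic data, then flag any
# kernel whose kurtosis deviates from the Gaussian value, suggesting the
# total number of kernels should be revisited.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
weights = np.array([0.5, 0.5])
means = np.array([-1.0, 1.0])
variances = np.array([1.0, 1.0])
for _ in range(50):
    weights, means, variances = em_step(x, weights, means, variances)

# Recompute responsibilities with the final parameters for the test.
diff = x[:, None] - means[None, :]
lik = np.exp(-0.5 * diff**2 / variances) / np.sqrt(2 * np.pi * variances)
r = weights * lik
r /= r.sum(axis=1, keepdims=True)
for j in range(len(means)):
    k = kernel_kurtosis(x, r[:, j], means[j], variances[j])
    flag = " <- revisit kernel count" if abs(k - 3.0) > KURTOSIS_TOL else ""
    print(f"kernel {j}: mean {means[j]:+.2f}, kurtosis {k:.2f}{flag}")
```

In this sketch the kernel count is fixed in advance; the point is only to show how a moment-based test on a fitted kernel can signal that the mixture order is wrong.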




Cite this article

Vlassis, N., Papakonstantinou, G. & Tsanakas, P. Mixture Density Estimation Based on Maximum Likelihood and Sequential Test Statistics. Neural Processing Letters 9, 63–76 (1999). https://doi.org/10.1023/A:1018624029058
