
Gaussian Mixture Models and Probabilistic Decision-Based Neural Networks for Pattern Classification: A Comparative Study

  • Original Article
  • Neural Computing & Applications

Abstract

Probabilistic Decision-Based Neural Networks (PDBNNs) can be considered a special form of Gaussian Mixture Models (GMMs) with trainable decision thresholds. This paper provides detailed illustrations comparing the recognition accuracy and decision boundaries of PDBNNs with those of GMMs on two pattern recognition tasks, namely the noisy XOR problem and the classification of two-dimensional vowel data. It highlights the strengths of PDBNNs by demonstrating that their thresholding mechanism is very effective in detecting data that belong to none of the known classes. The original PDBNNs use elliptical basis functions with diagonal covariance matrices, which may be inappropriate for modelling feature vectors with correlated components. This paper overcomes that limitation by using full covariance matrices, and shows that such matrices are effective in characterising non-spherical clusters.
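The sketch below illustrates the general idea described in the abstract: one full-covariance GMM per class, classification by maximum log-likelihood, and a per-class likelihood threshold that rejects patterns belonging to none of the known classes. It is not the authors' PDBNN training procedure (which learns its thresholds during training); the percentile-based threshold rule, the function names, the mixture order and the toy noisy-XOR-style data are all illustrative assumptions.

```python
"""Minimal sketch (not the paper's code): class-conditional GMMs with
full covariance matrices plus a log-likelihood rejection threshold,
analogous in spirit to PDBNN decision thresholds."""
import numpy as np
from sklearn.mixture import GaussianMixture

RNG = np.random.default_rng(0)


def fit_class_gmms(X, y, n_components=2):
    """Fit one full-covariance GMM per class label (handles non-spherical clusters)."""
    gmms = {}
    for label in np.unique(y):
        gmm = GaussianMixture(n_components=n_components,
                              covariance_type="full",
                              random_state=0)
        gmm.fit(X[y == label])
        gmms[label] = gmm
    return gmms


def set_thresholds(gmms, X, y, percentile=5.0):
    """Per-class log-likelihood thresholds taken from the training data
    (a simple stand-in for PDBNN's trained thresholds)."""
    return {label: np.percentile(gmms[label].score_samples(X[y == label]), percentile)
            for label in gmms}


def classify(gmms, thresholds, X):
    """Assign each sample to the most likely class, or -1 (reject)
    if even the best class scores below its threshold."""
    labels = sorted(gmms)
    scores = np.column_stack([gmms[l].score_samples(X) for l in labels])
    best = scores.argmax(axis=1)
    out = np.array([labels[b] for b in best])
    reject = scores[np.arange(len(X)), best] < np.array([thresholds[labels[b]] for b in best])
    out[reject] = -1
    return out


# Toy usage on a noisy-XOR-like layout (illustrative data only).
X0 = np.vstack([RNG.normal([0, 0], 0.3, (50, 2)), RNG.normal([1, 1], 0.3, (50, 2))])
X1 = np.vstack([RNG.normal([0, 1], 0.3, (50, 2)), RNG.normal([1, 0], 0.3, (50, 2))])
X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)

gmms = fit_class_gmms(X, y)
thr = set_thresholds(gmms, X, y)
# The far-away point should be rejected (-1) by the threshold test.
print(classify(gmms, thr, np.array([[0.0, 0.0], [5.0, 5.0]])))
```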



About this article

Cite this article

Yiu, K., Mak, M. & Li, C. Gaussian Mixture Models and Probabilistic Decision-Based Neural Networks for Pattern Classification: A Comparative Study. Neural Comput & Applic 8, 235–245 (1999). https://doi.org/10.1007/s005210050026


  • DOI: https://doi.org/10.1007/s005210050026
