Abstract
Probabilistic Decision-Based Neural Networks (PDBNNs) can be considered a special form of Gaussian Mixture Models (GMMs) with trainable decision thresholds. This paper provides detailed illustrations comparing the recognition accuracy and decision boundaries of PDBNNs with those of GMMs on two pattern recognition tasks, namely the noisy XOR problem and the classification of two-dimensional vowel data. The paper highlights the strengths of PDBNNs by demonstrating that their thresholding mechanism is very effective in detecting data not belonging to any known class. The original PDBNNs use elliptical basis functions with diagonal covariance matrices, which may be inappropriate for modelling feature vectors with correlated components. This paper overcomes this limitation by using full covariance matrices, and shows that such matrices are effective in characterising non-spherical clusters.
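The two ideas the abstract combines can be sketched briefly: a GMM with full covariance matrices models correlated, non-spherical clusters, and a decision threshold on the log-likelihood rejects patterns that belong to no known class. The sketch below is a hypothetical illustration, not the paper's implementation; the component parameters and the threshold value are assumed for demonstration (in a PDBNN the threshold would be learned during training).

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

def gmm_log_likelihood(x, weights, means, covs):
    """Log-likelihood of x under a GMM with full covariance matrices."""
    logps = [np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
             for w, m, c in zip(weights, means, covs)]
    return logsumexp(logps)

# Two elongated (non-spherical) clusters with correlated components,
# which diagonal covariances could not capture.  All values assumed.
weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([4.0, 4.0])]
covs = [np.array([[1.0, 0.8], [0.8, 1.0]]),
        np.array([[1.0, -0.8], [-0.8, 1.0]])]

# Assumed threshold; patterns scoring below it are treated as
# belonging to no known class (the PDBNN-style rejection mechanism).
THRESHOLD = -8.0

def classify_or_reject(x):
    ll = gmm_log_likelihood(x, weights, means, covs)
    return "accept" if ll >= THRESHOLD else "reject"
```

A point near a cluster centre scores a high log-likelihood and is accepted, while a point far from both clusters falls below the threshold and is rejected, illustrating how the thresholding mechanism detects out-of-class data.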
Cite this article
Yiu, K., Mak, M. & Li, C. Gaussian Mixture Models and Probabilistic Decision-Based Neural Networks for Pattern Classification: A Comparative Study. Neural Comput & Applic 8, 235–245 (1999). https://doi.org/10.1007/s005210050026