
Pattern Recognition by Probabilistic Neural Networks - Mixtures of Product Components versus Mixtures of Dependence Trees

Authors: Jiri Grim 1 and Pavel Pudil 2

Affiliations: 1 Academy of Sciences of the Czech Republic, Czech Republic; 2 Prague University of Economics, Czech Republic

Keyword(s): Probabilistic Neural Networks, Product Mixtures, Mixtures of Dependence Trees, EM Algorithm.

Related Ontology Subjects/Areas/Topics: Artificial Intelligence; Artificial Intelligence and Decision Support Systems; Biomedical Engineering; Biomedical Signal Processing; Computational Intelligence; Data Manipulation; Enterprise Information Systems; Health Engineering and Technology Applications; Human-Computer Interaction; Methodologies and Methods; Modular Implementation of Artificial Neural Networks; Neural Network Software and Applications; Neural Networks; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Signal Processing; Soft Computing; Theory and Methods

Abstract: We compare two probabilistic approaches to neural networks: the first based on mixtures of product components, the second on mixtures of dependence-tree distributions. Product mixture models can be estimated efficiently from data by means of the EM algorithm and have some practically important properties. However, in some cases the simplicity of product components may be too restrictive, and a natural idea is to use a more complex mixture of dependence-tree distributions. The concept of a dependence tree lets us explicitly describe the statistical relationships between pairs of variables at the level of individual components, so the approximation power of the resulting mixture may increase essentially. Nonetheless, in an application to the classification of numerals we have found that both models perform comparably and that the contribution of the dependence-tree structures decreases in the course of EM iterations. Thus the optimal estimate of the dependence-tree mixture tends to converge to a simple product mixture model. Regardless of computational aspects, the dependence-tree mixtures could help to clarify the role of dendritic branching in the highly selective excitability of neurons.
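The product mixture estimation the abstract refers to can be illustrated with a minimal EM sketch for a mixture of Bernoulli product components (binary data). This is an assumption-laden illustration, not the authors' implementation: the function name, the Bernoulli choice, and all parameters are hypothetical.

```python
import numpy as np

def em_bernoulli_product_mixture(X, n_components, n_iter=50, seed=0):
    """EM for a mixture of product (Bernoulli) components.

    Illustrative sketch only, assuming binary data X of shape (N, D).
    Returns mixing weights w (K,) and component parameters theta (K, D).
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    K = n_components
    w = np.full(K, 1.0 / K)                  # uniform initial mixing weights
    theta = rng.uniform(0.25, 0.75, (K, D))  # random initial Bernoulli parameters
    eps = 1e-9
    for _ in range(n_iter):
        # E-step: posterior responsibilities q(k | x_n), computed in log space.
        # Each component is a product over variables, so its log-density is a sum.
        log_p = (X @ np.log(theta + eps).T
                 + (1 - X) @ np.log(1 - theta + eps).T
                 + np.log(w + eps))
        log_p -= log_p.max(axis=1, keepdims=True)  # stabilize before exp
        q = np.exp(log_p)
        q /= q.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and per-component product parameters
        nk = q.sum(axis=0)
        w = nk / N
        theta = (q.T @ X) / (nk[:, None] + eps)
    return w, theta
```

A dependence-tree mixture would replace each product component with a tree-structured factorization (pairwise conditionals along a spanning tree, as in Chow-Liu), at the cost of estimating a tree per component in each M-step.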

CC BY-NC-ND 4.0


Paper citation in several formats:
Grim, J. and Pudil, P. (2014). Pattern Recognition by Probabilistic Neural Networks - Mixtures of Product Components versus Mixtures of Dependence Trees. In Proceedings of the International Conference on Neural Computation Theory and Applications (IJCCI 2014) - NCTA; ISBN 978-989-758-054-3, SciTePress, pages 65-75. DOI: 10.5220/0005077500650075

@conference{ncta14,
author={Jiri Grim and Pavel Pudil},
title={Pattern Recognition by Probabilistic Neural Networks - Mixtures of Product Components versus Mixtures of Dependence Trees},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications (IJCCI 2014) - NCTA},
year={2014},
pages={65-75},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005077500650075},
isbn={978-989-758-054-3},
}

TY - CONF

JO - Proceedings of the International Conference on Neural Computation Theory and Applications (IJCCI 2014) - NCTA
TI - Pattern Recognition by Probabilistic Neural Networks - Mixtures of Product Components versus Mixtures of Dependence Trees
SN - 978-989-758-054-3
AU - Grim, J.
AU - Pudil, P.
PY - 2014
SP - 65
EP - 75
DO - 10.5220/0005077500650075
PB - SciTePress
ER -