
Classification confidence weighted majority voting using decision tree classifiers

Norbert Tóth (Department of Measurement and Information Systems, Budapest University of Technology and Economics, Budapest, Hungary)
Béla Pataki (Department of Measurement and Information Systems, Budapest University of Technology and Economics, Budapest, Hungary)

International Journal of Intelligent Computing and Cybernetics

ISSN: 1756-378X

Article publication date: 6 June 2008


Abstract

Purpose

The purpose of this paper is to assign a classification confidence value to every individual sample classified by a decision tree and to use this value to combine the classifiers.

Design/methodology/approach

The proposed system is first explained theoretically, and its use and effectiveness are then demonstrated on sample datasets.

Findings

In this paper, a novel method is proposed to combine decision tree classifiers using calculated classification confidence values. This classification confidence is based on the distance to the relevant decision boundary, (distance-conditional) probability density estimation, and (distance-conditional) classification confidence estimation. It is shown that these values, provided by the individual classification trees, can be integrated to derive a consensus decision.
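To make the combination step concrete, the sketch below shows one way confidence-weighted majority voting can be implemented, assuming each tree supplies a per-sample confidence in [0, 1]; the function and variable names (weighted_vote, votes, confs) are illustrative and do not reproduce the paper's exact formulation.

```python
import numpy as np

def weighted_vote(class_votes, confidences, n_classes):
    """Combine per-tree class predictions using per-sample confidence weights.

    class_votes : predicted class index per tree, shape (n_trees,)
    confidences : classification confidence in [0, 1] per tree, shape (n_trees,)
    Returns the index of the consensus class.
    """
    scores = np.zeros(n_classes)
    for cls, conf in zip(class_votes, confidences):
        scores[cls] += conf  # each vote counts in proportion to the tree's confidence
    return int(np.argmax(scores))

# Illustrative usage: three trees classify one sample.
votes = np.array([0, 1, 1])        # predicted class per tree
confs = np.array([0.9, 0.4, 0.3])  # per-sample classification confidences
print(weighted_vote(votes, confs, n_classes=2))  # -> 0 (one confident vote beats two weak ones)
```

The example highlights the difference from plain majority voting: a single high-confidence vote can outweigh two low-confidence votes.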

Research limitations/implications

The proposed method is not limited to axis-parallel trees: it is applicable not only to oblique trees but also to any classifier system that uses hyperplanes to partition the input space.
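For illustration, the quantity on which such hyperplane-based confidence estimates are conditioned is the sample's distance to the split hyperplane; a minimal sketch (with an illustrative function name) is given below, covering both oblique and axis-parallel splits.

```python
import numpy as np

def distance_to_hyperplane(x, w, b):
    """Unsigned Euclidean distance from point x to the hyperplane w·x + b = 0.

    An axis-parallel split (x_i <= t) is the special case where w is the i-th
    unit basis vector and b = -t, so the same formula covers both tree types.
    """
    return abs(np.dot(w, x) + b) / np.linalg.norm(w)

# Axis-parallel split x_0 <= 2.5 written as a hyperplane, evaluated at x = (4, 1):
print(distance_to_hyperplane(np.array([4.0, 1.0]), np.array([1.0, 0.0]), -2.5))  # -> 1.5
```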

Originality/value

A novel method is presented to extend decision-tree-like classifiers with confidence calculation, and a voting system is proposed that uses this confidence information. The proposed system offers several novelties (e.g. it provides not only class probabilities but also classification confidences) and advantages over traditional approaches. The voting system does not require an auxiliary combiner or gating network, as in the mixture-of-experts structure, and the method is not limited to decision trees with axis-parallel splits; it is applicable to any classifier that uses hyperplanes to partition the input space.

Citation

Tóth, N. and Pataki, B. (2008), "Classification confidence weighted majority voting using decision tree classifiers", International Journal of Intelligent Computing and Cybernetics, Vol. 1 No. 2, pp. 169-192. https://doi.org/10.1108/17563780810874708

Publisher

Emerald Group Publishing Limited

Copyright © 2008, Emerald Group Publishing Limited
