ABSTRACT
No single machine learning algorithm is most accurate for all problems, due to the effect of an algorithm's inductive bias. Research has shown that a combination of experts of the same type, referred to as a mixture of homogeneous experts, can increase the accuracy of ensembles by reducing the adverse effect of an algorithm's inductive bias. However, the predictive power of a mixture of homogeneous experts is still limited by the inductive bias of the algorithm that makes up the mixture. For this reason, combinations of different machine learning algorithms, referred to as mixtures of heterogeneous experts, have been proposed to take advantage of the strengths of different machine learning algorithms and to reduce the adverse effects of their inductive biases. This paper presents a mixture of heterogeneous experts and compares its performance with that of a number of mixtures of homogeneous experts on a set of classification problems. The results indicate that a mixture of heterogeneous experts aggregates the advantages of its constituent experts, increasing the accuracy of predictions. The mixture of heterogeneous experts not only outperformed all homogeneous ensembles on two of the datasets, but also achieved the best overall accuracy rank across the various datasets.
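To make the distinction concrete, the sketch below builds a heterogeneous ensemble from three experts with different inductive biases and compares it against a homogeneous bagged-tree ensemble. This is a minimal illustration using scikit-learn's VotingClassifier and BaggingClassifier, not the authors' exact experimental setup; the dataset, base learners, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: heterogeneous vs. homogeneous ensembles.
# Dataset, base learners, and hyperparameters are illustrative choices,
# not those used in the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Mixture of heterogeneous experts: base learners of different types,
# combined by averaging predicted class probabilities (soft voting).
heterogeneous = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("mlp", make_pipeline(StandardScaler(),
                              MLPClassifier(max_iter=1000, random_state=0))),
        ("logreg", make_pipeline(StandardScaler(), LogisticRegression())),
    ],
    voting="soft",
)

# Mixture of homogeneous experts: many base learners of a single type
# (bagged decision trees), so the ensemble shares one inductive bias.
homogeneous = BaggingClassifier(
    DecisionTreeClassifier(random_state=0),
    n_estimators=10,
    random_state=0,
)

for name, model in [("heterogeneous", heterogeneous),
                    ("homogeneous", homogeneous)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

Soft voting is used here because it lets confident experts carry more weight than uncertain ones; hard (majority) voting is the simpler alternative when base learners do not produce calibrated probabilities.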
Index Terms
- Mixtures of Heterogeneous Experts