Abstract
Mixture-of-experts (ME) models are useful for solving complicated real-world classification problems. However, training an ME model with not only labeled data but also unlabeled data, which are easier to obtain, requires a new learning algorithm that accounts for the characteristics of the ME model. We propose global-local co-training (GLCT), a hybrid method that combines the supervised training procedure of the ME model with co-training, so that the ME model is trained in a semi-supervised learning (SSL) manner. GLCT uses a global model and a local model together, because the local model alone yields low accuracy when labeled training data are scarce. The two models enlarge the labeled data set from the unlabeled one and are trained on it by supplementing each other. To evaluate the method, we performed experiments on benchmark data sets from the UCI machine learning repository. The results confirm the feasibility of GLCT, and comparison experiments show that it outperforms an alternative method.
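The co-training scheme the abstract builds on can be sketched as follows. This is a minimal illustration of Blum-and-Mitchell-style co-training, not the authors' GLCT algorithm: two simple nearest-centroid classifiers (hypothetical stand-ins for the global and local models) take turns pseudo-labeling their most confident point from the unlabeled pool and handing it to the other model's training set.

```python
def centroid_fit(X, y):
    """Fit a trivial nearest-centroid classifier: one mean vector per class."""
    cents = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        cents[label] = [sum(col) / len(pts) for col in zip(*pts)]
    return cents

def centroid_predict(cents, x):
    """Return (predicted class, confidence margin between two nearest classes)."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(x, c)), lab)
                   for lab, c in cents.items())
    margin = dists[1][0] - dists[0][0] if len(dists) > 1 else float("inf")
    return dists[0][1], margin

def co_train(X_lab, y_lab, X_unlab, rounds=3):
    """Co-training sketch: two models enlarge the labeled set for each other."""
    X_a, y_a = list(X_lab), list(y_lab)   # model A's training set
    X_b, y_b = list(X_lab), list(y_lab)   # model B's training set
    pool = list(X_unlab)
    for _ in range(rounds):
        if not pool:
            break
        m_a = centroid_fit(X_a, y_a)
        m_b = centroid_fit(X_b, y_b)
        # Model A pseudo-labels its most confident pool point for model B.
        best = max(pool, key=lambda x: centroid_predict(m_a, x)[1])
        X_b.append(best); y_b.append(centroid_predict(m_a, best)[0])
        pool.remove(best)
        if not pool:
            break
        # Model B does the same for model A.
        best = max(pool, key=lambda x: centroid_predict(m_b, x)[1])
        X_a.append(best); y_a.append(centroid_predict(m_b, best)[0])
        pool.remove(best)
    # Final model trained on the union of both enlarged sets.
    return centroid_fit(X_a + X_b, y_a + y_b)

# Tiny demo: two well-separated clusters, one labeled point each.
labeled = [(0.0, 0.0), (10.0, 10.0)]
labels = [0, 1]
unlabeled = [(0.5, 0.2), (9.5, 9.8), (0.1, 0.9), (10.2, 9.5)]
model = co_train(labeled, labels, unlabeled)
print(centroid_predict(model, (0.3, 0.3))[0])  # expect class 0
```

In actual GLCT the two learners are the ME model's global and local components rather than two feature views, and the confidence criterion follows the ME model's structure; the exchange of pseudo-labeled points shown here is the shared idea.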
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Yoon, JW., Cho, SB. (2011). Global/Local Hybrid Learning of Mixture-of-Experts from Labeled and Unlabeled Data. In: Corchado, E., Kurzyński, M., Woźniak, M. (eds) Hybrid Artificial Intelligent Systems. HAIS 2011. Lecture Notes in Computer Science, vol 6678. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21219-2_57
Print ISBN: 978-3-642-21218-5
Online ISBN: 978-3-642-21219-2