Novel Approach to Gentle AdaBoost Algorithm with Linear Weak Classifiers

  • Conference paper
  • First Online:
Intelligent Information and Database Systems (ACIIDS 2020)

Abstract

This paper addresses the problem of calculating the value of the scoring function for weak classifiers operating in a sequential structure. An example of such a structure is the Gentle AdaBoost algorithm, a modification of which we propose in this work. In the proposed approach, the distance of an object from the decision boundary is first scaled within the decision regions defined by the weak classifier and then transformed by a log-normal function. The described algorithm was tested on six publicly available data sets and compared with the Gentle AdaBoost algorithm.
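The abstract only outlines the modification, so the following Python sketch illustrates the general idea rather than the authors' exact formulation: a linear weak classifier is fit under the current Gentle AdaBoost sample weights, each sample's distance to the weak classifier's decision boundary is scaled within the two decision regions, and the scaled distance is passed through a log-normal transform before the usual weight update. The helper names (`fit_linear_weak`, `lognormal_score`) and the parameters `mu` and `sigma` are assumptions introduced here for illustration.

```python
import numpy as np


def fit_linear_weak(X, y, w):
    """Weighted least-squares linear fit, used here as an illustrative linear weak classifier."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])        # add intercept column
    WX = Xb * w[:, None]                                  # row-wise weighting
    beta = np.linalg.pinv(Xb.T @ WX) @ (WX.T @ y)         # (X^T W X)^+ X^T W y
    return beta[:-1], beta[-1]                            # (coefficients, intercept)


def lognormal_score(d, mu=0.0, sigma=1.0):
    """Log-normal transform of a positive, scaled distance to the boundary (illustrative parameters)."""
    d = np.maximum(d, 1e-12)                              # the log-normal density is defined only for d > 0
    return np.exp(-(np.log(d) - mu) ** 2 / (2.0 * sigma ** 2)) / (d * sigma * np.sqrt(2.0 * np.pi))


def gentle_boost_round(X, y, w):
    """One boosting round: fit the linear weak classifier under weights `w`, score samples by the
    log-normal transform of their scaled distance to its decision boundary, and apply the
    Gentle AdaBoost weight update."""
    coef, intercept = fit_linear_weak(X, y, w)
    margin = X @ coef + intercept                         # signed distance (up to a constant) to the boundary
    region = np.sign(margin)                              # decision region: +1 / -1 side of the boundary
    scaled = np.abs(margin)
    for side in (-1.0, 1.0):                              # scale distances separately in each decision region
        mask = region == side
        if mask.any() and scaled[mask].max() > 0:
            scaled[mask] /= scaled[mask].max()
    f = region * lognormal_score(scaled)                  # region-signed score of this weak classifier
    w = w * np.exp(-y * f)                                # Gentle AdaBoost weight update (labels y in {-1, +1})
    return f, w / w.sum()


# Minimal usage on toy data: run a few rounds and classify by the sign of the summed scores.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)
w = np.full(len(y), 1.0 / len(y))
F = np.zeros(len(y))
for _ in range(10):
    f, w = gentle_boost_round(X, y, w)
    F += f
print("training accuracy:", np.mean(np.sign(F) == y))
```

In the unmodified Gentle AdaBoost, `f` would simply be the weak learner's real-valued response; the transform sketched above only changes how that per-round score is computed, so the rest of the boosting loop stays the same.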


Acknowledgment

This work was supported by the National Science Centre, Poland, under grant no. 2017/25/B/ST6/01750.

Author information

Correspondence to Robert Burduk, Wojciech Bożejko or Szymon Zacher.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Burduk, R., Bożejko, W., Zacher, S. (2020). Novel Approach to Gentle AdaBoost Algorithm with Linear Weak Classifiers. In: Nguyen, N., Jearanaitanakij, K., Selamat, A., Trawiński, B., Chittayasothorn, S. (eds) Intelligent Information and Database Systems. ACIIDS 2020. Lecture Notes in Computer Science, vol. 12033. Springer, Cham. https://doi.org/10.1007/978-3-030-41964-6_52

  • DOI: https://doi.org/10.1007/978-3-030-41964-6_52

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41963-9

  • Online ISBN: 978-3-030-41964-6

  • eBook Packages: Computer Science, Computer Science (R0)
