Feature Fusion for Improved Classification: Combining Dempster-Shafer Theory and Multiple CNN Architectures

  • Conference paper
  • First Online:
Computational Collective Intelligence (ICCCI 2024)

Abstract

Addressing uncertainty in Deep Learning (DL) is essential, as it enables models to make reliable predictions and informed decisions in complex, real-world environments where data may be incomplete or ambiguous. This paper introduces a novel algorithm that leverages Dempster-Shafer Theory (DST) to integrate multiple pre-trained models into an ensemble capable of providing more reliable classifications. The main steps of the proposed method are feature extraction, mass function calculation, fusion, and expected utility calculation. Experiments on the CIFAR-10 and CIFAR-100 datasets demonstrate the superior classification accuracy of the proposed DST-based method, which improves on the best individual pre-trained models by 5.4% and 8.4%, respectively. The results highlight the potential of DST as a robust framework for managing data-related uncertainty when applying DL in real-world scenarios.
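
The pipeline outlined in the abstract (per-model mass functions, evidence fusion, expected-utility decision) can be illustrated with a short, self-contained sketch. The version below builds mass functions by discounting each CNN's softmax output and keeping the remaining belief on the whole frame of discernment, combines them with Dempster's rule, and decides via the pignistic transform; the discount factor `alpha` and the pignistic decision step are illustrative assumptions, not necessarily the paper's exact formulation.

```python
# Minimal sketch of DST-based fusion of several CNN classifiers (illustrative,
# not the authors' exact method): discounted softmax -> mass functions,
# Dempster's rule of combination, pignistic transform for the final decision.
import numpy as np


def softmax_to_mass(probs, alpha=0.9):
    """Convert a softmax vector into a mass function whose focal elements are
    the singleton classes plus the whole frame Theta (ignorance).
    `alpha` is the fraction of belief committed to singletons (assumed value)."""
    probs = np.asarray(probs, dtype=float)
    singletons = alpha * probs           # mass on each class {i}
    theta = 1.0 - singletons.sum()       # remaining mass on Theta
    return singletons, theta


def dempster_combine(m1, t1, m2, t2):
    """Dempster's rule of combination for two mass functions defined on the
    singleton classes (m1, m2) and Theta (t1, t2)."""
    # Conflict: mass assigned to incompatible pairs of distinct singletons.
    k = m1.sum() * m2.sum() - np.dot(m1, m2)
    norm = 1.0 - k
    if norm <= 0:
        raise ValueError("total conflict between sources")
    combined = (m1 * m2 + m1 * t2 + t1 * m2) / norm
    theta = (t1 * t2) / norm
    return combined, theta


def pignistic(m, theta):
    """Expected-utility style decision: spread the ignorance mass evenly over
    all classes and return the resulting probability vector."""
    return m + theta / m.size


if __name__ == "__main__":
    # Hypothetical softmax outputs of three pre-trained CNNs for one image.
    outputs = [
        np.array([0.60, 0.30, 0.10]),
        np.array([0.50, 0.40, 0.10]),
        np.array([0.20, 0.70, 0.10]),
    ]
    m, t = softmax_to_mass(outputs[0])
    for probs in outputs[1:]:
        m, t = dempster_combine(m, t, *softmax_to_mass(probs))
    fused = pignistic(m, t)
    print("fused probabilities:", fused)
    print("predicted class:", int(np.argmax(fused)))
```

Keeping part of each network's belief on the whole frame prevents a single over-confident model from dominating the fusion, which is the usual motivation for discounting before applying Dempster's rule.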


Acknowledgment

The authors would like to acknowledge the support of Prince Sultan University for paying the fees of this publication.

Author information

Corresponding author

Correspondence to Ayyub Alzahem.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Alzahem, A., Boulila, W., Driss, M., Koubaa, A. (2024). Feature Fusion for Improved Classification: Combining Dempster-Shafer Theory and Multiple CNN Architectures. In: Nguyen, N.T., et al. Computational Collective Intelligence. ICCCI 2024. Lecture Notes in Computer Science, vol. 14811. Springer, Cham. https://doi.org/10.1007/978-3-031-70819-0_22

  • DOI: https://doi.org/10.1007/978-3-031-70819-0_22

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-70818-3

  • Online ISBN: 978-3-031-70819-0

  • eBook Packages: Computer Science, Computer Science (R0)
