Compositional Committees of Tiny Networks

  • Conference paper
  • Neural Information Processing (ICONIP 2021)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1517)


Abstract

Deep neural networks tend to be accurate but computationally expensive, whereas ensembles tend to be fast but do not capitalize on hierarchical representations. This paper proposes an approach that attempts to combine the advantages of both. Hierarchical ensembles represent one effort in this direction; however, they are not compositional in a representational sense, since they combine only classifier decisions and/or outputs. We take this effort one step further in the form of compositional ensembles, which exploit the composition of the hidden representations of classifiers, here called tiny networks on account of their significantly limited scale. Our particular instance of compositional ensembles is called a Compositional Committee of Tiny Networks (CoCoTiNe). We experimented with different CoCoTiNe variants involving different types of composition, input usage, and ensemble decisions. The best variant demonstrated that CoCoTiNe is more accurate than standard hierarchical committees and comparable in accuracy to vanilla Convolutional Neural Networks, while being 25.7 times faster in a standard CPU setup. In conclusion, the paper demonstrates that compositional ensembles, especially in the context of tiny networks, are a viable and efficient approach for combining the advantages of deep networks and ensembles.
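The core idea in the abstract — first-stage tiny networks whose hidden representations (not just their outputs) are composed and fed to a second-stage tiny network, optionally alongside the raw input — can be illustrated with a minimal, untrained sketch. All names, dimensions, and the concatenation-based composition below are illustrative assumptions for exposition, not the paper's actual CoCoTiNe configuration:

```python
import math
import random

random.seed(0)

def tiny_net(in_dim, hidden_dim, out_dim):
    """A 'tiny network': one small hidden layer with random (untrained) weights."""
    W1 = [[random.gauss(0, 0.5) for _ in range(in_dim)] for _ in range(hidden_dim)]
    W2 = [[random.gauss(0, 0.5) for _ in range(hidden_dim)] for _ in range(out_dim)]
    def forward(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W1]
        y = [sum(w * hi for w, hi in zip(row, h)) for row in W2]
        return h, y  # hidden representation and output
    return forward

IN, HID, OUT = 8, 4, 3
stage1 = [tiny_net(IN, HID, OUT) for _ in range(3)]  # the committee members

x = [random.uniform(-1, 1) for _ in range(IN)]

# Stage 1: collect each member's hidden representation, not just its decision.
hiddens = [net(x)[0] for net in stage1]

# Composition: here simple concatenation, with the raw input appended
# (one possible form of the "input usage" choice mentioned in the abstract).
composed_input = [v for h in hiddens for v in h] + x

# Stage 2: another tiny network operates on the composed representation.
stage2 = tiny_net(len(composed_input), HID, OUT)
_, logits = stage2(composed_input)

# Ensemble decision: here, the argmax of the second-stage output.
decision = max(range(OUT), key=lambda i: logits[i])
print(decision)
```

A plain hierarchical committee would instead concatenate only the first-stage outputs (the `y` vectors) or their votes; passing the hidden vectors `h` forward is what makes the ensemble compositional in the representational sense described above.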



Author information

Correspondence to Tomas Maul.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Seng, G.H., Maul, T., Kapadnis, M.N. (2021). Compositional Committees of Tiny Networks. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Communications in Computer and Information Science, vol 1517. Springer, Cham. https://doi.org/10.1007/978-3-030-92310-5_45


  • DOI: https://doi.org/10.1007/978-3-030-92310-5_45

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92309-9

  • Online ISBN: 978-3-030-92310-5

  • eBook Packages: Computer Science, Computer Science (R0)
