
DeepDream Algorithm for Data Augmentation in a Neural Network Ensemble Applied to Multiclass Image Classification

  • Conference paper
Recent Challenges in Intelligent Information and Database Systems (ACIIDS 2022)

Abstract

This paper presents an application of the DeepDream algorithm to data augmentation for images. The algorithm passes an image through a trained neural network and obscures the image's main features, whose importance and location are estimated from the output values of the network's layers. A new neural network trained on the processed data is forced to find and learn a different set of features, since the known features have been hidden. Networks trained on the original data and on the processed data are then combined in an ensemble. Experiments were conducted on a balanced image dataset of 5000 images in 10 classes, using a neural network based on the InceptionV3 architecture in two variants: with non-pretrained weights and with pretrained weights. The network received a 256 × 256 pixel image as input and was trained with the categorical cross-entropy loss function, the accuracy metric, and the Adam optimizer with a learning rate of 0.0001. The algorithm's improvements are almost insignificant for an ensemble classifying a small number of classes, but its impact grows as the number of classes increases: for binary classification there may be no improvement in ensemble accuracy, whereas with more than 5 classes the algorithm's influence on the accuracy of the final ensemble increases. Without the use of other types of data processing and augmentation, the improvement in ensemble accuracy is 0.5–4%, depending on the initial training conditions.
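For concreteness, a minimal sketch of the augmentation step is given below, assuming TensorFlow/Keras with the InceptionV3 backbone named in the abstract. DeepDream runs gradient ascent on the input image to amplify the activations of a chosen layer, which distorts, and thereby hides, the features the trained network already relies on. The layer name (mixed5), step size, and number of steps are illustrative assumptions, not the authors' reported settings.

import tensorflow as tf
from tensorflow.keras.applications import InceptionV3

# InceptionV3 feature extractor; 256 x 256 input as in the paper.
base = InceptionV3(weights="imagenet", include_top=False,
                   input_shape=(256, 256, 3))
# Mid-level activations stand in for the "layer output values" used to
# estimate feature importance and location (the layer choice is assumed).
feature_model = tf.keras.Model(base.input, base.get_layer("mixed5").output)

@tf.function
def dream_step(img, step_size=0.01):
    # Gradient ascent on the input: amplify whatever the chosen layer
    # responds to, progressively obscuring the original features.
    with tf.GradientTape() as tape:
        tape.watch(img)
        loss = tf.reduce_mean(feature_model(img))
    grads = tape.gradient(loss, img)
    grads /= tf.math.reduce_std(grads) + 1e-8  # normalize the step size
    return img + step_size * grads

def deepdream_augment(image, steps=20):
    """Return a DeepDream-processed copy of a 256x256x3 image in [0, 1]."""
    img = tf.expand_dims(tf.cast(image, tf.float32), 0)
    for _ in range(steps):
        img = dream_step(img)
    return tf.clip_by_value(tf.squeeze(img, 0), 0.0, 1.0)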
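The stated training configuration (categorical cross-entropy, accuracy metric, Adam at a learning rate of 0.0001, pretrained versus non-pretrained weights) and a simple ensemble over members trained on original and on DeepDream-processed data could be sketched as follows. Averaging the softmax outputs is an assumed combination rule; the abstract does not specify how the ensemble's predictions are merged.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

def build_classifier(num_classes=10, pretrained=True):
    # The two variants compared in the paper: pretrained vs. randomly
    # initialized InceptionV3 weights.
    base = InceptionV3(weights="imagenet" if pretrained else None,
                       include_top=False, input_shape=(256, 256, 3))
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(num_classes, activation="softmax"),  # assumed head
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# One member is trained on the original images and one on their
# DeepDream-processed copies (e.g. via deepdream_augment above):
#   model_orig.fit(x_train, y_train, ...)
#   model_dream.fit(x_dream, y_train, ...)

def ensemble_predict(members, images):
    """Average the members' softmax outputs (assumed combination rule)."""
    probs = np.stack([m.predict(images, verbose=0) for m in members])
    return probs.mean(axis=0)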



Author information

Correspondence to Dmitrii Viaktin.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Viaktin, D., Garcia-Zapirain, B., Mendez Zorrilla, A. (2022). DeepDream Algorithm for Data Augmentation in a Neural Network Ensemble Applied to Multiclass Image Classification. In: Szczerbicki, E., Wojtkiewicz, K., Nguyen, S.V., Pietranik, M., Krótkiewicz, M. (eds) Recent Challenges in Intelligent Information and Database Systems. ACIIDS 2022. Communications in Computer and Information Science, vol 1716. Springer, Singapore. https://doi.org/10.1007/978-981-19-8234-7_51


  • DOI: https://doi.org/10.1007/978-981-19-8234-7_51

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-19-8233-0

  • Online ISBN: 978-981-19-8234-7

  • eBook Packages: Computer Science, Computer Science (R0)
