
Neural Network Subgraphs Correlation with Trained Model Accuracy

  • Conference paper
Artificial Intelligence and Soft Computing (ICAISC 2020)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12415)


Abstract

Neural Architecture Search (NAS) is a computationally demanding process of finding the optimal neural network architecture for a given task. Conceptually, NAS comprises a search strategy applied to a predefined search space, together with a performance evaluation method. The design of the search space alone is expected to substantially impact NAS efficiency. We treat neural networks as graphs and, by analyzing a dataset of convolutional neural networks trained for image recognition, find a correlation between the presence of certain subgraphs and a network's final test accuracy. We also consider a subgraph-based network distance measure and suggest opportunities for improved NAS algorithms that could benefit from our observations.
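The kind of analysis the abstract describes can be sketched in miniature: encode each architecture as a graph (here simplified to a linear chain of operations), compute a binary feature marking whether a given subgraph (a two-operation motif) is present, and rank-correlate that feature with final test accuracy. This is a toy illustration only; the architectures, operation names, and accuracies below are invented, and the paper's actual study uses full convolutional-network graphs rather than chains.

```python
def ranks(xs):
    """Average (1-based) ranks; tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with xs[order[i]].
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation, i.e. Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Toy dataset: (chain of layer ops, final test accuracy) -- invented values.
nets = [
    (["conv3", "conv3", "pool"], 0.91),
    (["conv1", "pool", "conv3"], 0.84),
    (["conv3", "conv3", "conv3"], 0.93),
    (["conv1", "conv1", "pool"], 0.80),
]
motif = ("conv3", "conv3")  # the "subgraph" whose presence we test
has_motif = [int(any((a, b) == motif for a, b in zip(ops, ops[1:])))
             for ops, _ in nets]
accs = [acc for _, acc in nets]
print(round(spearman(has_motif, accs), 3))  # rank correlation of motif presence vs. accuracy
```

On real data one would enumerate many candidate subgraphs over the full architecture graphs (e.g. with a subgraph-isomorphism test) and check each resulting correlation coefficient for significance before drawing conclusions.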



Acknowledgments

The author would like to express heartfelt gratitude to Prof. Dariusz Dereniowski, Faculty of Electronics, Telecommunications and Informatics, Gdańsk University of Technology, Poland, for numerous discussions and ideas that greatly supported the presented work. Utmost gratitude is also directed to Marek M. Landowski of Intel, Data Platforms Group, for supporting the presented research and providing valuable feedback. The author has been partially supported by the ministry subsidy for research at Gdańsk University of Technology.

Author information

Corresponding author

Correspondence to Izajasz Wrosz.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Wrosz, I. (2020). Neural Network Subgraphs Correlation with Trained Model Accuracy. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2020. Lecture Notes in Computer Science (LNAI), vol 12415. Springer, Cham. https://doi.org/10.1007/978-3-030-61401-0_26


  • DOI: https://doi.org/10.1007/978-3-030-61401-0_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-61400-3

  • Online ISBN: 978-3-030-61401-0

  • eBook Packages: Computer Science (R0)
