Synthesis of ranking functions via DNN

  • Original Article
  • Published in Neural Computing and Applications

Abstract

We propose a new approach to the synthesis of non-polynomial ranking functions for loops via deep neural networks (DNNs). First, we construct a ranking function template from a DNN structure. The coefficients of the template are then learned from a training set we construct, yielding a candidate ranking function that is transcendental and non-polynomial owing to the sigmoid activation functions in the network. Finally, the candidate ranking function is verified to determine whether it is a true ranking function. Most existing methods focus on linear or polynomial ranking functions and are limited by their verification tools; in this paper we make progress on the synthesis of non-polynomial ranking functions and on a new verification method for candidate ranking functions. The experimental results show that, for a number of loops taken from other work, our approach finds ranking functions efficiently. Moreover, for some loops for which existing methods obtain multi-phase ranking functions, our method can directly detect a global ranking function. In particular, our method can also detect global ranking functions for some loops with transcendental terms, which existing methods cannot handle.
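
The three-step pipeline described above can be illustrated with a short, self-contained sketch. This is not the authors' implementation: the example loop (while x > 0: x = x - 1), the sampling of transitions, the hinge-style loss used to encode the ranking conditions, the hidden-layer width, and the purely numerical spot check at the end are all assumptions made for illustration; the actual method verifies the closed-form candidate rather than checking it on samples.

```python
# Illustrative sketch of the candidate-then-verify pipeline described above;
# NOT the paper's implementation. The loop, loss, and sampling are assumptions.
import torch
import torch.nn as nn

# Hypothetical example loop:  while x > 0: x = x - 1   (the state is just x).
def sample_transitions(n=200):
    xs = torch.randint(1, 50, (n, 1)).float()   # states satisfying the guard
    xs_next = xs - 1.0                           # their successor states
    return xs, xs_next

# Step 1: ranking-function template  f(x) = w2 . sigmoid(W1 x + b1) + b0.
# The sigmoid layer is what makes the candidate transcendental/non-polynomial.
template = nn.Sequential(nn.Linear(1, 8), nn.Sigmoid(), nn.Linear(8, 1))

# Step 2: learn the coefficients so that the two ranking conditions hold on
# the sampled transitions, encoded here as hinge penalties:
#   (a) bounded below on guard states:       f(x) >= 0
#   (b) strictly decreasing on transitions:  f(x) - f(x') >= 1
def loss_fn(f_x, f_x_next):
    bounded = torch.relu(-f_x).mean()
    decreasing = torch.relu(1.0 - (f_x - f_x_next)).mean()
    return bounded + decreasing

optimizer = torch.optim.Adam(template.parameters(), lr=1e-2)
xs, xs_next = sample_transitions()
for _ in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(template(xs), template(xs_next))
    loss.backward()
    optimizer.step()

# Step 3 (stand-in only): a numerical spot check on fresh samples. A real
# verification step would take the closed-form candidate extracted from the
# trained weights and prove conditions (a) and (b) over all guard states,
# e.g. with a constraint or SMT solver, before accepting it.
with torch.no_grad():
    xs, xs_next = sample_transitions(1000)
    f_x, f_x_next = template(xs), template(xs_next)
    print("min f(x)         :", f_x.min().item())
    print("min f(x) - f(x') :", (f_x - f_x_next).min().item())
```

If the spot check (or the real verification) reveals a violated condition, the learned network is only a failed candidate; what separates a candidate from a genuine ranking function is that the conditions are proved for every reachable state, not merely for the sampled ones.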

Acknowledgements

This research is partially supported by the National Natural Science Foundation of China (NNSFC) (61572024, 11771421, 61103110), the Natural Science Foundation of Chongqing (cstc2019jcyj-msxmX0638), the Chinese Academy of Sciences “Light of West China” program, the Chongqing Science and Technology Innovation Guidance Special Project (cstc2018jcyj-yszxX0002, cstc2019yszx-jcyjX003), and the National Key Research and Development Project (2020YFA07123000).

Author information

Corresponding author

Correspondence to Yi Li.

Ethics declarations

Conflicts of interest

The authors declare that no support, financial or otherwise, has been received from any organization that may have an interest in the submitted work, and that there are no other relationships or activities that could appear to have influenced the submitted work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Tan, W., Li, Y. Synthesis of ranking functions via DNN. Neural Comput & Applic 33, 9939–9959 (2021). https://doi.org/10.1007/s00521-021-05763-8

