
PAVER 2.0: an open source environment for automated performance analysis of benchmarking data

Published in: Journal of Global Optimization

Abstract

In this paper we describe PAVER 2.0, an environment (i.e. a process and a suite of tools supporting that process) for the automated performance analysis of benchmarking data. This new environment improves on its predecessor by addressing some of the shortcomings of the original PAVER (Bussieck et al. in Global Optimization and Constraint Satisfaction, Lecture Notes in Computer Science, vol 2861, pp 223–238. Springer, Berlin, 2003. doi:10.1007/978-3-540-39901-8_17) and extending its capabilities. The changes serve to further the original goals of PAVER (automation of the visualization and summarization of benchmarking data) while making the environment more accessible for use and modification by the entire community of potential users. In particular, we have targeted the end-users of optimization software, as they are best able to make the many subjective choices necessary to produce impactful results when benchmarking optimization software. We illustrate with some sample analyses conducted via PAVER 2.0.



Notes

  1. http://www.gamsworld.org/performance.

  2. http://www.gamsworld.org/performance/paver.

  3. http://pandas.pydata.org.

  4. While arithmetic means are sensitive to variations of data with relatively large range and insensitive to variations of data with relatively small range, geometric means are more sensitive to variations close to zero. As a compromise, PAVER can also compute shifted geometric means [1], which reduce the effect of data points close to zero in the geometric mean by shifting.

  5. PAVER uses a gap tolerance of \(10^{-6}\) by default. However, we have run our solvers with a zero gap tolerance.

    Fig. 1: Performance profile for solving time of 5 MIP solvers and the corresponding virtually best and worst solvers

    Fig. 2: Visualization of some performance metrics: number of instances where a solver failed, shifted geometric mean of solving time, average ratio of a solver's solving time to Gimli's solving time, number of instances solved to optimality, number of instances where an optimal solution has been found, and average primal gap at termination

  6. Note that our globallib.solu file contains known optimal values for only half of the GlobalLib instances, so the number of global optimal solutions found and the mean primal gap are computed w.r.t. these 214 instances only.
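
Note 4 above describes shifted geometric means. A minimal sketch of the computation (the default shift of 10 used here is an assumption, a value commonly used for solving times in the benchmarking literature, not necessarily PAVER's default):

```python
import math

def shifted_geometric_mean(values, shift=10.0):
    """Shifted geometric mean: exp(mean(log(v + s))) - s.

    The shift s damps the influence of values close to zero,
    which would otherwise dominate an ordinary geometric mean.
    s = 10 is a common choice for solving times (an assumption
    here, not a documented PAVER setting).
    """
    n = len(values)
    return math.exp(sum(math.log(v + shift) for v in values) / n) - shift

# With shift = 0 this reduces to the ordinary geometric mean:
# shifted_geometric_mean([2.0, 8.0], shift=0.0) -> 4.0
```

Note how a tiny data point such as 0.001 barely moves the shifted mean, while it pulls the unshifted geometric mean toward zero.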
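
Fig. 1 above shows a performance profile in the sense of Dolan and Moré [8]: for each solver s, the profile plots the fraction of instances solved within a factor tau of the best solver. A rough sketch of the underlying computation (function and variable names are ours, not part of PAVER's API):

```python
def performance_ratios(times):
    """Per-instance performance ratios r = t[s][p] / min over solvers,
    the raw data behind a Dolan-More performance profile.

    times: dict mapping solver name -> list of run times over the
    same instance set (use float('inf') for failed runs; assumes
    at least one solver succeeds on each instance).
    """
    solvers = list(times)
    n = len(times[solvers[0]])
    ratios = {s: [] for s in solvers}
    for p in range(n):
        best = min(times[s][p] for s in solvers)
        for s in solvers:
            ratios[s].append(times[s][p] / best)
    return ratios

def profile_value(ratios_s, tau):
    """rho_s(tau): fraction of instances a solver handles within
    a factor tau of the best solver."""
    return sum(1 for r in ratios_s if r <= tau) / len(ratios_s)
```

Plotting rho_s(tau) against tau for each solver, together with the per-instance minimum ("virtually best") and maximum ("virtually worst") times, yields a chart of the kind shown in Fig. 1.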

References

  1. Achterberg, T.: Constraint Integer Programming. Ph.D. thesis, TU Berlin (2007). http://nbnresolving.de/urn:nbn:de:0297-zib-11129

  2. Achterberg, T.: Benchmarking a MIP Solver. Talk in CPAIOR Master Class (2010). http://cpaior2010.ing.unibo.it/?q=node/10

  3. Berthold, T.: Measuring the impact of primal heuristics. Oper. Res. Lett. 41(6), 611–614 (2013). doi:10.1016/j.orl.2013.08.007

  4. Billups, S.C., Dirkse, S.P., Ferris, M.C.: A comparison of large scale mixed complementarity problem solvers. Comput. Optim. Appl. 7(1), 3–25 (1997). doi:10.1023/A:1008632215341

  5. Bussieck, M.R., Drud, A.S., Meeraus, A.: MINLPLib—a collection of test models for mixed-integer nonlinear programming. INFORMS J. Comput. 15(1), 114–119 (2003). doi:10.1287/ijoc.15.1.114.15159

  6. Bussieck, M.R., Drud, A.S., Meeraus, A., Pruessner, A.: Quality assurance and global optimization. In: Bliek, C., Jermann, C., Neumaier, A. (eds.) Global Optimization and Constraint Satisfaction, Lecture Notes in Computer Science, vol. 2861, pp. 223–238. Springer, Berlin (2003). doi:10.1007/978-3-540-39901-8_17

  7. Crowder, H., Dembo, R.S., Mulvey, J.M.: On reporting computational experiments with mathematical software. ACM Trans. Math. Softw. 5(2), 193–203 (1979). doi:10.1145/355826.355833

  8. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002). doi:10.1007/s101070100263

  9. Dolan, E.D., Moré, J.J., Munson, T.S.: Benchmarking Optimization Software with COPS 3.0. Tech. Rep. ANL/MCS-273, Mathematics and Computer Science Division, Argonne National Laboratory (2004). http://www.mcs.anl.gov/more/cops

  10. Dolan, E.D., Moré, J.J., Munson, T.S.: Optimality measures for performance profiles. SIAM J. Optim. 16(3), 891–909 (2006). doi:10.1137/040608015

  11. Drud, A.S.: Testing and Tuning a New Solver Version Using Performance Tests. INFORMS 2002, San Jose, Session on ’Benchmarking and performance testing of optimization software’. http://www.gams.com/presentations/present_performance.pdf. Accessed 15 May 2013

  12. Exler, O., Lehmann, T., Schittkowski, K.: A comparative study of SQP-type algorithms for nonlinear and nonconvex mixed-integer optimization. Math. Program. Comput. 4(4), 383–412 (2012). doi:10.1007/s12532-012-0045-0

  13. GAMS Development: GAMS/Examiner, User’s Manual (2013). http://www.gams.com/dd/docs/solvers/examiner.pdf. Accessed 8 May 2013

  14. Granlund, T., The GMP Development Team: GNU MP: The GNU Multiple Precision Arithmetic Library (2012). http://gmplib.org

  15. Hendel, G.: PyEvalGui - GUI Components to Facilitate Evaluation of SCIP and Other Solving Software (2013, in development)

  16. Koch, T., Achterberg, T., Andersen, E., Bastert, O., Berthold, T., Bixby, R.E., Danna, E., Gamrath, G., Gleixner, A.M., Heinz, S., Lodi, A., Mittelmann, H., Ralphs, T., Salvagnin, D., Steffy, D.E., Wolter, K.: MIPLIB 2010—mixed integer programming library version 5. Math. Program. Comput. 3(2), 103–163 (2011). doi:10.1007/s12532-011-0025-9

  17. Mahajan, A., Leyffer, S., Kirches, C.: Solving Mixed-Integer Nonlinear Programs by QP-Diving. Preprint ANL/MCS-P2071-0312, Argonne National Laboratory (2012). http://www.optimization-online.org/DB_HTML/2012/03/3409.html

  18. Meeraus, A.: Globallib (2013). http://www.gamsworld.org/global/globallib.htm. Accessed 8 May 2013

  19. Mittelmann, H.D.: An independent benchmarking of SDP and SOCP solvers. Math. Program. 95(2), 407–430 (2003). doi:10.1007/s10107-002-0355-5

  20. Mittelmann, H.D.: DTOS—a service for the optimization community. SIAG/OPT Views-and-News 18, 17–20 (2007)

  21. Mittelmann, H.D.: Decision Tree for Optimization Software (2013). http://plato.asu.edu/guide.html. Accessed 8 May 2013

  22. Mittelmann, H.D., Pruessner, A.: A server for automated performance analysis of benchmarking data. Optim. Methods Softw. 21(1), 105–120 (2006). doi:10.1080/10556780500065366

  23. SCIP Development Team: How to Run Automated Tests with SCIP. http://scip.zib.de/doc/html/TEST.shtml

  24. Why open source? (2013). http://www.coin-or.org. Accessed 15 May 2013

  25. Wikiquote: Winston Churchill (2013). http://en.wikiquote.org/w/index.php?title=Winston_Churchill&oldid=1552921. Accessed 8 May 2013

Author information

Correspondence to Steven P. Dirkse.

Cite this article

Bussieck, M.R., Dirkse, S.P. & Vigerske, S. PAVER 2.0: an open source environment for automated performance analysis of benchmarking data. J Glob Optim 59, 259–275 (2014). https://doi.org/10.1007/s10898-013-0131-5
