Abstract
State-of-the-art optimization algorithms often expose many parameters that must be configured to achieve good empirical performance. Manually tuning such parameters requires tedious experimentation and often leads to unsatisfactory outcomes. Accordingly, researchers have developed several frameworks to tune the parameters of a given algorithm over a class of problems. Until very recently, however, these approaches had not been tested on or applied to many-objective algorithms. This study formulates a many-objective algorithm configuration (MAC) method, available for both MATLAB and Python. In MAC, we take the importance of a given configuration into account by building a conditional probability graph. In this light, the introduced algorithm aims to explore the more important variables, identified using an undirected fully-connected graph. Experimental results reveal that MAC performs better than the state-of-the-art F-Race and SMAC frameworks.
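The abstract only sketches the graph-based mechanism, so a minimal Python illustration may help: it ranks parameters by eigenvector centrality on an undirected fully-connected graph, in the spirit of the graph-based ranking of Roffo and Melzi cited below. The correlation-based edge weights, the function name `parameter_importance`, and the toy data are illustrative assumptions, not MAC's actual conditional-probability construction.

```python
import numpy as np

def parameter_importance(samples, scores):
    """Rank parameters by eigenvector centrality on an undirected
    fully-connected graph (illustrative sketch, not MAC itself).

    samples : (n_runs, n_params) array of tried configurations
    scores  : (n_runs,) array of performance values
    """
    n_params = samples.shape[1]
    # Pairwise edge weights: absolute correlation between parameters,
    # scaled by each parameter's correlation with performance. This
    # stands in for MAC's conditional-probability weighting (assumed).
    corr = np.abs(np.corrcoef(samples, rowvar=False))
    relevance = np.abs(np.array(
        [np.corrcoef(samples[:, j], scores)[0, 1] for j in range(n_params)]))
    W = corr * np.outer(relevance, relevance)  # symmetric, fully connected
    np.fill_diagonal(W, 0.0)                   # no self-loops
    # Eigenvector centrality: component j of the principal eigenvector
    # scores how strongly parameter j is tied to other important ones.
    vals, vecs = np.linalg.eigh(W)
    centrality = np.abs(vecs[:, np.argmax(vals)])
    return centrality / centrality.sum()

# Toy check: 50 random configurations of 4 parameters, where only
# parameter 0 actually drives performance.
rng = np.random.default_rng(0)
X = rng.random((50, 4))
y = X[:, 0] ** 2 + 0.1 * rng.random(50)
print(parameter_importance(X, y))  # parameter 0 gets the top score
```

A configurator built on such scores could then sample the highest-ranked parameters more densely while coarsening the search over the rest.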
References
Bergstra, J.S., Bardenet, R., Bengio, Y., Kégl, B.: Algorithms for hyper-parameter optimization. In: Advances in Neural Information Processing Systems, pp. 2546–2554 (2011)
Birattari, M., Yuan, Z., Balaprakash, P., Stützle, T.: F-Race and iterated F-Race: an overview. In: Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.) Experimental Methods for the Analysis of Optimization Algorithms, pp. 311–336. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-02538-9_13
Blot, A., Hoos, H.H., Jourdan, L., Kessaci-Marmion, M.É., Trautmann, H.: MO-ParamILS: a multi-objective automatic algorithm configuration framework. In: Festa, P., Sellmann, M., Vanschoren, J. (eds.) LION 2016. LNCS, vol. 10079, pp. 32–47. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-50349-3_3
Chen, J., Xin, B., Peng, Z., Dou, L., Zhang, J.: Optimal contraction theorem for exploration-exploitation tradeoff in search and optimization. IEEE Trans. Syst. Man Cybern.-Part A: Syst. Hum. 39(3), 680–691 (2009)
Cheng, R., Jin, Y., Olhofer, M., Sendhoff, B.: A reference vector guided evolutionary algorithm for many-objective optimization. IEEE Trans. Evol. Comput. 20(5), 773–791 (2016)
Cheng, R., et al.: Benchmark functions for the CEC’2018 competition on many-objective optimization. Technical report, University of Birmingham, United Kingdom (2018)
Coello, C.A.C., Pulido, G.T., Lechuga, M.S.: Handling multiple objectives with particle swarm optimization. IEEE Trans. Evol. Comput. 8(3), 256–279 (2004)
Deb, K., Jain, H.: An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part i: solving problems with box constraints. IEEE Trans. Evol. Comput. 18(4), 577–601 (2014)
Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002)
Durillo, J.J., Nebro, A.J.: jMetal: a Java framework for multi-objective optimization. Adv. Eng. Softw. 42(10), 760–771 (2011)
Golovin, D., Solnik, B., Moitra, S., Kochanski, G., Karro, J., Sculley, D.: Google vizier: a service for black-box optimization. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 1487–1495. ACM (2017)
Hutter, F., Hoos, H.H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration. In: Coello, C.A.C. (ed.) LION 2011. LNCS, vol. 6683, pp. 507–523. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25566-3_40
Ilievski, I., Akhtar, T., Feng, J., Shoemaker, C.A.: Efficient hyperparameter optimization for deep learning algorithms using deterministic RBF surrogates. In: AAAI, pp. 822–829 (2017)
Jones, D.R., Schonlau, M., Welch, W.J.: Efficient global optimization of expensive black-box functions. J. Global Optim. 13(4), 455–492 (1998)
Olsson, A., Sandberg, G., Dahlblom, O.: On latin hypercube sampling for structural reliability analysis. Struct. Saf. 25(1), 47–68 (2003)
Park, J., Sandberg, I.W.: Universal approximation using radial-basis-function networks. Neural Comput. 3(2), 246–257 (1991)
Regis, R.G., Shoemaker, C.A.: A stochastic radial basis function method for the global optimization of expensive functions. INFORMS J. Comput. 19(4), 497–509 (2007)
Roffo, G., Melzi, S.: Features selection via eigenvector centrality. In: Proceedings of New Frontiers in Mining Complex Patterns (NFMCP 2016), October 2016 (2016)
Roffo, G., Melzi, S., Castellani, U., Vinciarelli, A.: Infinite latent feature selection: a probabilistic latent graph-based ranking approach. In: Computer Vision and Pattern Recognition (2017)
Sierra, M.R., Coello Coello, C.A.: Improving PSO-based multi-objective optimization using crowding, mutation and ∈-Dominance. In: Coello, C.A.C., Hernández Aguirre, A., Zitzler, E. (eds.) EMO 2005. LNCS, vol. 3410, pp. 505–519. Springer, Heidelberg (2005). https://doi.org/10.1007/978-3-540-31880-4_35
Snoek, J., Larochelle, H., Adams, R.P.: Practical Bayesian optimization of machine learning algorithms. In: Advances in Neural Information Processing Systems, pp. 2951–2959 (2012)
Snoek, J., et al.: Scalable Bayesian optimization using deep neural networks. In: International Conference on Machine Learning, pp. 2171–2180 (2015)
Tian, Y., Cheng, R., Zhang, X., Jin, Y.: PlatEMO: a MATLAB platform for evolutionary multi-objective optimization [educational forum]. IEEE Comput. Intell. Mag. 12(4), 73–87 (2017)
Acknowledgments
This research was supported in part through computational resources provided by Mésocentre of Strasbourg.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Rakhshani, H., Idoumghar, L., Lepagnot, J., Brévilliers, M. (2019). MAC: Many-objective Automatic Algorithm Configuration. In: Deb, K., et al. Evolutionary Multi-Criterion Optimization. EMO 2019. Lecture Notes in Computer Science(), vol 11411. Springer, Cham. https://doi.org/10.1007/978-3-030-12598-1_20
DOI: https://doi.org/10.1007/978-3-030-12598-1_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-12597-4
Online ISBN: 978-3-030-12598-1
eBook Packages: Computer Science, Computer Science (R0)