Abstract
We develop a genetic algorithm heuristic, named the Multi-Indicator GA (MIGA), that uses multiple rank indicators taken from well-established evolutionary algorithms, including NSGA-II, IBEA and SPEA2. At every iteration, MIGA uses one of the available indicators to select the individuals that will serve as parents in the next iteration. The indicator is chosen according to predefined probabilities, which are found through the analysis of mixture experiments. Mixture experiments are a particular type of experimental design suited to the calibration of parameters that represent probabilities. Their main output is an explanatory model of algorithm performance as a function of its parameters; the parameter setting that maximizes the model's predicted performance therefore yields a good configuration of the algorithm. To the best of our knowledge, this is the first paper in which mixture experiments are used for heuristic tuning. The mixture-experiments approach allowed us to identify and exploit synergies between the different rank indicators. This is demonstrated by our experimental results, in which the tuned MIGA compares favorably to other well-established algorithms, an uncalibrated multi-indicator algorithm, and a multi-indicator algorithm calibrated using a more conventional approach.
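The per-iteration mechanism described above (drawing one rank indicator according to fixed probabilities) can be sketched as follows. This is an illustrative sketch only: the indicator names and the probability values are placeholders, not the calibrated values reported in the paper.

```python
import random

# Hypothetical indicator labels standing in for the NSGA-II, IBEA and
# SPEA2 rank indicators used by MIGA.
INDICATORS = ["nsga2_rank", "ibea_rank", "spea2_rank"]

# Placeholder probabilities; in the paper these would be the values
# found by analyzing the mixture experiments (they must sum to 1).
PROBS = [0.5, 0.3, 0.2]

def choose_indicator(rng: random.Random) -> str:
    """Pick one rank indicator according to the predefined probabilities."""
    return rng.choices(INDICATORS, weights=PROBS, k=1)[0]

# Over many iterations, each indicator is used with roughly its
# assigned probability.
rng = random.Random(42)
counts = {name: 0 for name in INDICATORS}
for _ in range(10_000):
    counts[choose_indicator(rng)] += 1
```

Because the probabilities form a simplex (non-negative, summing to one), they are exactly the kind of parameter vector that mixture experiment designs are built to explore.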
References
Adenso-Díaz, B., Laguna, M.: Fine-tuning of algorithms using fractional experimental designs and local search. Oper. Res. 54, 99–114 (2006)
Aickelin, U., Li, J.: An estimation of distribution algorithm for nurse scheduling. Ann. Oper. Res. 155(1), 289–309 (2007)
Bai, R., Burke, E.K., Kendall, G.: Heuristic, meta-heuristic and hyper-heuristic approaches for fresh produce inventory control and shelf space allocation. J. Oper. Res. Soc. 59, 1387–1397 (2008)
Bartz-Beielstein, T.: Experimental Research in Evolutionary Computation. Natural Computing Series. Springer, Berlin (2006)
Birattari, M., Stützle, T., Paquete, L., Varrentrapp, K.: A racing algorithm for configuring metaheuristics. In: Langdon, W. (ed.) GECCO 2002: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 11–18 (2002)
Burke, E., Hart, E., Kendall, G., Newall, J., Ross, P., Schulenburg, S.: Hyper-heuristics: an emerging direction in modern search technology. In: Glover, F., Kochenberger, G. (eds.) Handbook of Metaheuristics, pp. 457–474. Springer, Berlin (2003a)
Burke, E.D., Silva, J.D.L., Soubeiga, E.: Hyperheuristic approaches for multiobjective optimisation. In: Proceedings of the Fifth Metaheuristics International Conference (MIC 2003), pp. 11.1–11.6, Kyoto, Japan (2003b)
Burke, E.K., McCollum, B., Meisels, A., Petrovic, S., Qu, R.: A graph-based hyper-heuristic for educational timetabling problems. Eur. J. Oper. Res. 176(1), 177–192 (2007)
Chasalow, S., Brand, R.: Algorithm AS 299: generation of simplex lattice points. Appl. Stat. 44(4), 534–545 (1995)
Chiarandini, M., Goegebeur, Y.: Mixed models for the analysis of optimization algorithms. In: Bartz-Beielstein, T., Chiarandini, M., Paquete, L., Preuss, M. (eds.) Experimental Methods for the Analysis of Optimization Algorithms, pp. 225–264. Springer, Berlin (2010)
Conover, W.J.: Practical Nonparametric Statistics, 3rd edn. Wiley Series in Probability and Statistics. Wiley, New York (1999)
Cornell, J.: Experiments with Mixtures: Design, Models and the Analysis of Mixture Data, 3rd edn. Wiley Series in Probability and Statistics. Wiley, New York (2002)
Coy, S., Golden, B.L., Runger, G.C., Wasil, E.A.: Using experimental design to find effective parameter settings for heuristics. J. Heuristics 7(1), 77–97 (2001)
Deb, K., Agrawal, R.B.: Real-coded genetic algorithms with simulated binary crossover: studies on multi-modal and multi-objective problems. Complex Syst. 9(6), 431–454 (1995)
Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA–II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002a)
Deb, K., Thiele, L., Laumanns, M., Zitzler, E.: Scalable multi-objective optimization test problems. In: Congress on Evolutionary Computation (CEC 2002), vol. 1, pp. 825–830 (2002b)
Hutter, F., Hoos, H.H., Leyton-Brown, K.: ParamILS: an automatic algorithm configuration framework. J. Artif. Intell. Res. 36, 267–306 (2009)
Knowles, J., Thiele, L., Zitzler, E.: A tutorial on the performance assessment of stochastic multiobjective optimizers. TIK Report 214 (revised version), Computer Engineering and Networks Laboratory (TIK), ETH Zurich, Switzerland (2006)
Li, H., Zhang, Q.: Multiobjective optimization problems with complicated Pareto sets, MOEA/D and NSGA-II. IEEE Trans. Evol. Comput. 13(2), 284–302 (2009)
López-Ibáñez, M., Stützle, T.: Automatic configuration of multi-objective ACO algorithms. In: ANTS 2010. Lecture Notes in Computer Science, vol. 6234, pp. 95–106 (2010)
Montgomery, D.C.: Design and Analysis of Experiments, 6th edn. Wiley, New York (2005)
Ridge, E.: Design of experiments for the tuning of optimisation algorithms. Ph.D. thesis, The University of York (2007)
Terashima-Marín, H., Zárate, C.J.F., Ross, P., Valenzuela-Rendón, M.: A GA-based method to produce generalized hyper-heuristics for the 2d-regular cutting stock problem. In: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO 2006) (2006)
Vázquez-Rodríguez, J.A., Petrovic, S.: A new dispatching rule based genetic algorithm for the multi-objective job shop problem. J. Heuristics 16(6), 771–793 (2010). http://dx.doi.org/10.1007/s10732-009-9120-8
Wessing, S., Beume, N., Rudolph, G., Naujoks, B.: Parameter tuning boosts performance of variation operators in multi-objective optimization. In: Schaefer, R., Cotta, C., Kolodziej, J., Rudolph, G. (eds.) Parallel Problem Solving from Nature, PPSN XI. Lecture Notes in Computer Science, vol. 6238, pp. 728–737 (2010)
Zhou, A., Jin, Y., Zhang, Q., Sendhoff, B., Tsang, E.: Combining model-based and genetics-based offspring generation for multi-objective optimization using a convergence criterion. In: 2006 IEEE Congress on Evolutionary Computation, pp. 3234–3241 (2006)
Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: empirical results. Evol. Comput. 8(2), 173–195 (2000)
Zitzler, E., Künzli, S.: Indicator-based selection in multiobjective search. In: Yao, X., et al. (eds.) Parallel Problem Solving from Nature (PPSN VIII). Lecture Notes in Computer Science, vol. 3242, pp. 832–842. Springer, Berlin (2004)
Zitzler, E., Laumanns, M., Thiele, L.: SPEA2: improving the strength Pareto evolutionary algorithm for multiobjective optimization. In: Evolutionary Methods for Design, Optimisation and Control, pp. 95–100 (2002)
Zitzler, E., Thiele, L., Laumanns, M., Fonseca, C.M., Grunert da Fonseca, V.: Performance assessment of multiobjective optimizers: an analysis and review. IEEE Trans. Evol. Comput. 7(2), 117–132 (2003)
Vázquez-Rodríguez, J.A., Petrovic, S. Calibrating continuous multi-objective heuristics using mixture experiments. J Heuristics 18, 699–726 (2012). https://doi.org/10.1007/s10732-012-9204-8