Abstract
In an environment where fitness evaluations are disturbed by noise, the selection operator is prone to errors, occasionally and unintentionally selecting the worse individual. A common way to reduce the noise is to sample an individual’s fitness several times and use the average as an estimate of the true fitness. Unfortunately, such noise reduction is computationally rather expensive. Sequential sampling does not fix the number of samples in advance for all individuals, but instead draws samples one at a time until a certain level of confidence is reached. This reduces the total number of samples, because individuals with very different true fitness values can be compared on the basis of only a few samples (the signal-to-noise ratio is high in this case), while very similar individuals are evaluated often enough to guarantee the desired level of confidence. In this paper, we show for the case of tournament selection that a state-of-the-art sequential sampling procedure can save a significant portion of the fitness evaluations without increasing the selection error. Furthermore, we design a new sequential sampling procedure and show that it saves an even larger portion of the fitness evaluations. Finally, we also compare the three methods empirically on a simple onemax function.
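The idea of sequential sampling for a noisy tournament can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact procedure: the stopping rule is a simple z-style bound on the difference of sample means, and the function names and the threshold parameter `z` are hypothetical.

```python
import math
import random

def sequential_tournament(sample_a, sample_b, max_samples=50, z=2.0):
    """Pick the apparently better of two individuals under noisy fitness.

    Draws one sample per individual at a time and stops early once the
    observed difference of the sample means clearly exceeds the estimated
    noise level (a simple z-style confidence bound). Returns the winner
    ('a' or 'b') and the number of samples spent per individual."""
    xs, ys = [], []
    for _ in range(max_samples):
        xs.append(sample_a())
        ys.append(sample_b())
        n = len(xs)
        if n < 2:
            continue  # need at least two samples to estimate the variance
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        var_x = sum((v - mean_x) ** 2 for v in xs) / (n - 1)
        var_y = sum((v - mean_y) ** 2 for v in ys) / (n - 1)
        sd_diff = math.sqrt((var_x + var_y) / n)  # std. dev. of mean difference
        if abs(mean_x - mean_y) > z * sd_diff:
            break  # confident enough: stop sampling early
    return ('a' if sum(xs) >= sum(ys) else 'b'), len(xs)
```

With very different true fitness values (say, noisy evaluations around 10 versus around 0), the loop typically terminates after a handful of samples, whereas two near-identical individuals are sampled up to `max_samples` times, which is exactly the adaptive behavior described above.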
References
Fitzpatrick, J.M., Grefenstette, J.J.: Genetic algorithms in noisy environments. Machine Learning 3, 101–120 (1988)
Arnold, D.V., Beyer, H.G.: Efficiency and mutation strength adaptation of the (μ/μ_I, λ)-ES in a noisy environment. In: [22], pp. 39–48
Arnold, D.V., Beyer, H.G.: Local performance of the (μ/μ_I, λ)-ES in a noisy environment. In: Martin, W., Spears, W. (eds.) Foundations of Genetic Algorithms, pp. 127–142. Morgan Kaufmann, San Francisco (2000)
Arnold, D.V., Beyer, H.G.: A comparison of evolution strategies with other direct search methods in the presence of noise. Computational Optimization and Applications 24, 135–159 (2003)
Miller, B.L., Goldberg, D.E.: Genetic algorithms, selection schemes, and the varying effects of noise. Evolutionary Computation 4, 113–131 (1996)
Branke, J., Schmidt, C.: Selection in the presence of noise. In: Cantú-Paz, E., Foster, J.A., Deb, K., Davis, L., Roy, R., O’Reilly, U.-M., Beyer, H.-G., Kendall, G., Wilson, S.W., Harman, M., Wegener, J., Dasgupta, D., Potter, M.A., Schultz, A., Dowsland, K.A., Jonoska, N., Miller, J., Standish, R.K. (eds.) GECCO 2003. LNCS, vol. 2723, pp. 766–777. Springer, Heidelberg (2003)
Beyer, H.G.: Toward a theory of evolution strategies: Some asymptotical results from the (1+, λ)-theory. Evolutionary Computation 1, 165–188 (1993)
Hammel, U., Bäck, T.: Evolution strategies on noisy functions, how to improve convergence properties. In: Davidor, Y., Männer, R., Schwefel, H.-P. (eds.) PPSN 1994. LNCS, vol. 866, Springer, Heidelberg (1994)
Miller, B.L.: Noise, Sampling, and Efficient Genetic Algorithms. PhD thesis, Dept. of Computer Science, University of Illinois at Urbana-Champaign (1997), available as TR 97001
Arnold, D.V.: Noisy Optimization with Evolution Strategies. Kluwer, Dordrecht (2002)
Beyer, H.G.: Evolutionary algorithms in noisy environments: Theoretical issues and guidelines for practice. Computer Methods in Applied Mechanics and Engineering 186, 239–267 (2000)
Aizawa, A.N., Wah, B.W.: Scheduling of genetic algorithms in a noisy environment. Evolutionary Computation 2, 97–122 (1994)
Albert, L.A., Goldberg, D.E.: Efficient evaluation genetic algorithms under integrated fitness functions. Technical Report 2001024, Illinois Genetic Algorithms Laboratory, Urbana-Champaign, USA (2001)
Stagge, P.: Averaging efficiently in the presence of noise. In: Eiben, A.E., Bäck, T., Schoenauer, M., Schwefel, H.-P. (eds.) PPSN 1998. LNCS, vol. 1498, pp. 188–197. Springer, Heidelberg (1998)
Branke, J.: Creating robust solutions by means of an evolutionary algorithm. In: Eiben, A.E., Bäck, T., Schoenauer, M., Schwefel, H.-P. (eds.) PPSN 1998. LNCS, vol. 1498, pp. 119–128. Springer, Heidelberg (1998)
Branke, J., Schmidt, C., Schmeck, H.: Efficient fitness estimation in noisy environments. In: Spector, L., Goodman, E.D., Wu, A., Langdon, W.B., Voigt, H.M., Gen, M., Sen, S., Dorigo, M., Pezeshk, S., Garzon, M.H., Burke, E. (eds.) Genetic and Evolutionary Computation Conference, pp. 243–250. Morgan Kaufmann, San Francisco (2001)
Sano, Y., Kita, H.: Optimization of noisy fitness functions by means of genetic algorithms using history of search. In: [22], pp. 571–580
Sano, Y., Kita, H., Kamihira, I., Yamaguchi, M.: Online optimization of an engine controller by means of a genetic algorithm using history of search. In: Asia-Pacific Conference on Simulated Evolution and Learning, pp. 2929–2934. Springer, Heidelberg (2000)
Tsutsui, S., Ghosh, A.: Genetic algorithms with a robust solution searching scheme. IEEE Transactions on Evolutionary Computation 1, 201–208 (1997)
Branke, J.: Evolutionary Optimization in Dynamic Environments. Kluwer Academic Publishers, Dordrecht (2001)
Kim, S.H., Nelson, B.: A fully sequential procedure for indifference-zone selection in simulation. ACM Transactions on Modeling and Computer Simulation 11, 251–273 (2001)
Schoenauer, M., Deb, K., Rudolph, G., Yao, X., Lutton, E., Merelo, J.J., Schwefel, H.P. (eds.): PPSN 2000. LNCS, vol. 1917. Springer, Heidelberg (2000)
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Branke, J., Schmidt, C. (2004). Sequential Sampling in Noisy Environments. In: Yao, X., et al. Parallel Problem Solving from Nature - PPSN VIII. PPSN 2004. Lecture Notes in Computer Science, vol 3242. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30217-9_21
DOI: https://doi.org/10.1007/978-3-540-30217-9_21
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23092-2
Online ISBN: 978-3-540-30217-9