pects of the point sets we should take into account. We have shown that not only does the discrepancy of samples within a single generation impact performance, but that the diversity between subsequent generations does as well. Future work could investigate the relationship between point set size, discrepancy, diversity, and performance in more depth.
ACKNOWLEDGEMENTS
We want to thank François Clément and Carola Doerr for providing us with the optimized low-discrepancy point sets used in this paper.
Sampling in CMA-ES: Low Numbers of Low Discrepancy Points