Sampling in CMA-ES: Low Numbers of Low Discrepancy Points
Jacob de Nobel, Diederick Vermetten, Thomas Bäck, Anna Kononova
2024
Abstract
The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is one of the most successful examples of a derandomized evolution strategy. However, it still relies on randomly sampled offspring, which can be generated by drawing from a uniform distribution and subsequently transforming the draws into the required Gaussian. Previous work has shown that replacing this uniform sampling with a low-discrepancy sampler, such as Halton or Sobol sequences, can improve performance over a wide set of problems. We show that iterating through small, fixed sets of low-discrepancy points can still perform better than the default uniform distribution. Moreover, using only 128 points throughout the search is sufficient to closely approximate the empirical performance of using the complete pseudorandom sequence up to dimensionality 40 on the BBOB benchmark. For lower dimensionalities (below 10), we find that using as few as 32 unique low-discrepancy points performs similarly to or better than uniform sampling. In 2D, for which highly optimized low-discrepancy point sets are available, we demonstrate that using these points yields the highest empirical performance and requires only 16 samples to improve over uniform sampling. Overall, we establish a clear relation between the L2 discrepancy of the used point set and the empirical performance of the CMA-ES.
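To make the sampling scheme concrete, the sketch below illustrates the general idea under stated assumptions: a fixed pool of scrambled Sobol points (here generated with SciPy's scipy.stats.qmc module) is created once, and each generation cycles through it, mapping every uniform point to a standard normal vector with the inverse normal CDF before applying the usual CMA-ES transformation y = m + sigma * A z with C = A A^T (here taken as a Cholesky factor for brevity). The function names, the pool size of 128, and the Cholesky shortcut are illustrative choices, not the paper's exact implementation.

import numpy as np
from scipy.stats import norm, qmc

def make_point_pool(dim, n_points=128, seed=42):
    # Fixed pool of scrambled Sobol points in [0, 1)^dim, generated once
    # and reused for the entire run; n_points should be a power of two.
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    return sampler.random(n_points)

def sample_offspring(pool, start, mean, sigma, C, lam):
    # Cycle through the fixed low-discrepancy pool, map each uniform point
    # to a standard normal vector via the inverse CDF, and apply the
    # CMA-ES transformation y = m + sigma * A z with C = A A^T.
    A = np.linalg.cholesky(C)  # simplification; CMA-ES typically uses B D from an eigendecomposition
    offspring = np.empty((lam, mean.size))
    for i in range(lam):
        u = pool[(start + i) % len(pool)]
        z = norm.ppf(u)                   # uniform -> standard Gaussian
        offspring[i] = mean + sigma * (A @ z)
    return offspring, (start + lam) % len(pool)

# Usage example: sample one generation of 20 offspring in 10-D.
dim, lam = 10, 20
pool = make_point_pool(dim)
X, next_start = sample_offspring(pool, 0, np.zeros(dim), 1.0, np.eye(dim), lam)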
Paper Citation
in Harvard Style
de Nobel J., Vermetten D., Bäck T. and Kononova A. (2024). Sampling in CMA-ES: Low Numbers of Low Discrepancy Points. In Proceedings of the 16th International Joint Conference on Computational Intelligence - Volume 1: ECTA; ISBN 978-989-758-721-4, SciTePress, pages 120-126. DOI: 10.5220/0013000900003837
in BibTeX Style
@conference{ecta24,
author={Jacob de Nobel and Diederick Vermetten and Thomas Bäck and Anna Kononova},
title={Sampling in CMA-ES: Low Numbers of Low Discrepancy Points},
booktitle={Proceedings of the 16th International Joint Conference on Computational Intelligence - Volume 1: ECTA},
year={2024},
pages={120-126},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013000900003837},
isbn={978-989-758-721-4},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 16th International Joint Conference on Computational Intelligence - Volume 1: ECTA
TI - Sampling in CMA-ES: Low Numbers of Low Discrepancy Points
SN - 978-989-758-721-4
AU - de Nobel J.
AU - Vermetten D.
AU - Bäck T.
AU - Kononova A.
PY - 2024
SP - 120
EP - 126
DO - 10.5220/0013000900003837
PB - SciTePress