Evolution Strategies and Covariance Matrix Adaptation - Investigating New Shrinkage Techniques

Silja Meyer-Nieberg, Erik Kropat


This paper discusses covariance matrix adaptation in evolution strategies, a central and essential mechanism of the search process. Estimating the covariance matrix from samples that are small relative to the search space dimension is known to be problematic. This situation is, however, common in optimization, raising the question of whether the performance of evolutionary algorithms could be improved. In statistics, several approaches have recently been developed to improve the quality of the maximum-likelihood estimate, but they are seldom applied in evolutionary computation. Here, we focus on linear shrinkage, which requires relatively little additional effort. Several shrinkage approaches and shrinkage targets are integrated into evolution strategies and analyzed in a series of experiments.
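The core idea of linear shrinkage can be illustrated with a minimal sketch: the sample covariance is replaced by a convex combination of itself and a structured target, here the scaled identity. The function name and the fixed shrinkage intensity `lam` below are illustrative assumptions; in practice the intensity is chosen data-dependently, e.g. by the Ledoit-Wolf formula.

```python
import numpy as np

def linear_shrinkage(samples, lam):
    """Shrink the sample covariance toward a scaled-identity target.

    samples: (n, d) array of n observations in d dimensions.
    lam: shrinkage intensity in [0, 1]; 0 keeps the raw sample
         estimate, 1 returns the pure target.
    """
    S = np.cov(samples, rowvar=False)       # raw sample covariance
    d = S.shape[0]
    target = (np.trace(S) / d) * np.eye(d)  # identity scaled to the average variance
    return (1.0 - lam) * S + lam * target   # convex combination

# Few samples relative to the dimension: the raw estimate is
# rank-deficient, while the shrunk estimate has full rank.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 10))            # n = 5 samples, d = 10 dimensions
S = np.cov(X, rowvar=False)
C = linear_shrinkage(X, lam=0.3)
print(np.linalg.matrix_rank(S))             # at most n - 1 = 4
print(np.linalg.matrix_rank(C))             # full rank: 10
```

This illustrates why shrinkage matters in the small-sample regime discussed above: the shrunk matrix is well-conditioned and invertible even when the number of samples is far below the dimension.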



Paper Citation

in Harvard Style

Meyer-Nieberg S. and Kropat E. (2016). Evolution Strategies and Covariance Matrix Adaptation - Investigating New Shrinkage Techniques. In Proceedings of the 8th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART, ISBN 978-989-758-172-4, pages 105-116. DOI: 10.5220/0005703201050116

in Bibtex Style

@conference{meyernieberg2016shrinkage,
author={Silja Meyer-Nieberg and Erik Kropat},
title={Evolution Strategies and Covariance Matrix Adaptation - Investigating New Shrinkage Techniques},
booktitle={Proceedings of the 8th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2016},
pages={105-116},
doi={10.5220/0005703201050116},
isbn={978-989-758-172-4},
}

in EndNote Style

JO - Proceedings of the 8th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART,
TI - Evolution Strategies and Covariance Matrix Adaptation - Investigating New Shrinkage Techniques
SN - 978-989-758-172-4
AU - Meyer-Nieberg S.
AU - Kropat E.
PY - 2016
SP - 105
EP - 116
DO - 10.5220/0005703201050116