Model Guided Sampling Optimization for Low-dimensional Problems

Lukáš Bajer, Martin Holeňa

2015

Abstract

Optimization of very expensive black-box functions requires utilization of the maximum information gathered during the optimization process. Model Guided Sampling Optimization (MGSO) forms a more robust alternative to Jones' Gaussian-process-based EGO algorithm. Instead of maximizing the expected improvement, as EGO does, MGSO samples the probability of improvement, which is shown to help avoid getting trapped in local minima. Further, MGSO can reach close-to-optimum solutions faster than standard optimization algorithms on low-dimensional or smooth problems.
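To make the sampling step concrete, the following Python sketch (an illustrative assumption, not the authors' implementation) shows the probability of improvement under a Gaussian-process posterior and a simple way of drawing candidate points proportionally to it. The candidate set, the random seed, and the categorical sampling step are hypothetical; MGSO itself samples the PoI with more elaborate machinery such as slice sampling (Bajer et al., 2013).

import numpy as np
from scipy.stats import norm

def probability_of_improvement(mu, sigma, target):
    # PoI(x) = P(f(x) <= target) = Phi((target - mu(x)) / sigma(x))
    # for a GP posterior with mean mu and standard deviation sigma (minimization).
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predictive variance
    return norm.cdf((target - mu) / sigma)

# Hypothetical usage on a finite candidate set: draw the next evaluation points
# with probability proportional to PoI, rather than maximizing a criterion as EGO does.
rng = np.random.default_rng(0)
mu = rng.normal(size=1000)                        # assumed GP posterior means
sigma = np.abs(rng.normal(scale=0.5, size=1000))  # assumed GP posterior std. deviations
f_min = mu.min()                                  # best objective value found so far
poi = probability_of_improvement(mu, sigma, target=f_min)
next_idx = rng.choice(len(mu), size=5, replace=False, p=poi / poi.sum())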

References

  1. Bajer, L., Holeňa, M., and Charypar, V. (2013). Improving the model guided sampling optimization by model search and slice sampling. In Vinar, T. et al., editors, ITAT 2013 - Workshops, Posters, and Tutorials, pages 86-91. CreateSpace Indp. Publ. Platform.
  2. Buche, D., Schraudolph, N., and Koumoutsakos, P. (2005). Accelerating evolutionary algorithms with Gaussian process fitness function models. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 35(2):183-194.
  3. Hansen, N., Finck, S., Ros, R., and Auger, A. (2009). Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical Report RR-6829, INRIA. Updated February 2010.
  4. Hansen, N. and Ostermeier, A. (2001). Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2):159-195.
  5. Holeňa, M., Cukic, T., Rodemerck, U., and Linke, D. (2008). Optimization of catalysts using specific, description based genetic algorithms. Journal of Chemical Information and Modeling, 48:274-282.
  6. Jin, Y. (2005). A comprehensive survey of fitness approximation in evolutionary computation. Soft Computing, 9(1):3-12.
  7. Jones, D. R. (2001). A taxonomy of global optimization methods based on response surfaces. Journal of Global Optimization, 21(4):345-383.
  8. Jones, D. R., Schonlau, M., and Welch, W. J. (1998). Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13(4):455-492.
  9. Larrañaga, P. and Lozano, J. A. (2002). Estimation of distribution algorithms: A new tool for evolutionary computation. Kluwer.
  10. Rasmussen, C. E. and Williams, C. K. I. (2006). Gaussian Processes for Machine Learning. Adaptative computation and machine learning series. MIT Press.


Paper Citation


in Harvard Style

Bajer L. and Holeňa M. (2015). Model Guided Sampling Optimization for Low-dimensional Problems. In Proceedings of the International Conference on Agents and Artificial Intelligence - Volume 2: ICAART, ISBN 978-989-758-074-1, pages 451-456. DOI: 10.5220/0005222404510456


in Bibtex Style

@conference{icaart15,
author={Lukáš Bajer and Martin Holeňa},
title={Model Guided Sampling Optimization for Low-dimensional Problems},
booktitle={Proceedings of the International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2015},
pages={451-456},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005222404510456},
isbn={978-989-758-074-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - Model Guided Sampling Optimization for Low-dimensional Problems
SN - 978-989-758-074-1
AU - Bajer L.
AU - Holeňa M.
PY - 2015
SP - 451
EP - 456
DO - 10.5220/0005222404510456