Beyer, H.-G., Olhofer, M., and Sendhoff, B. (2004). On the impact of systematic noise on the evolutionary optimization performance - a sphere model analysis. Genetic Programming and Evolvable Machines, 5(4):327–360.
Broyden, C. G. (1970). The convergence of a class of double-rank minimization algorithms 2, the new algorithm. Journal of the Institute for Mathematics and Applications, 6:222–231.
Byrd, R., Lu, P., Nocedal, J., and Zhu, C. (1995). A limited memory algorithm for bound constrained optimization. SIAM Journal on Scientific Computing, 16(5).
Cervellera, C. and Muselli, M. (2003). A deterministic learning approach based on discrepancy. In Proceedings of WIRN'03, pages 53–60.
Collobert, R. and Bengio, S. (2001). SVMTorch: Support vector machines for large-scale regression problems. Journal of Machine Learning Research, 1:143–160.
Conn, A., Scheinberg, K., and Toint, L. (1997). Recent progress in unconstrained nonlinear optimization without derivatives.
DeJong, K. A. (1992). Are genetic algorithms function optimizers? In Männer, R. and Manderick, B., editors, Proceedings of the 2nd Conference on Parallel Problem Solving from Nature, pages 3–13. North Holland.
Fitzpatrick, J. and Grefenstette, J. (1988). Genetic algorithms in noisy environments. In Machine Learning: Special Issue on Genetic Algorithms, P. Langley, ed. Dordrecht: Kluwer Academic Publishers, vol. 3, pages 101–120.
Fletcher, R. (1970). A new approach to variable-metric algorithms. Computer Journal, 13:317–322.
Gagné, C. (2005). OpenBEAGLE 3.1.0-alpha.
Gelly, S., Ruette, S., and Teytaud, O. (2006). Comparison-based algorithms: worst-case optimality, optimality w.r.t. a Bayesian prior, the intraclass-variance minimization in EDA, and implementations with billiards. In PPSN-BTP Workshop.
Goldfarb, D. (1970). A family of variable-metric algorithms derived by variational means. Mathematics of Computation, 24:23–26.
Hansen, N. and Ostermeier, A. (1996). Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In Proceedings of the IEEE Conference on Evolutionary Computation (CEC 1996), pages 312–317. IEEE Press.
Hickernell, F. J. (1998). A generalized discrepancy and quadrature error bound. Mathematics of Computation, 67(221):299–322.
Hooke, R. and Jeeves, T. A. (1961). Direct search solution of numerical and statistical problems. Journal of the ACM, 8:212–229.
Jin, Y. and Branke, J. (2005). Evolutionary optimization in uncertain environments - a survey. IEEE Transactions on Evolutionary Computation, 9(3):303–317.
Kaupe, A. F. (1963). Algorithm 178: direct search. Communications of the ACM, 6(6):313–314.
Keijzer, M., Merelo, J. J., Romero, G., and Schoenauer, M. (2001). Evolving Objects: A general purpose evolutionary computation library. In Artificial Evolution, pages 231–244.
LaValle, S. M., Branicky, M. S., and Lindemann, S. R. (2004). On the relationship between classical grid search and probabilistic roadmaps. International Journal of Robotics Research, 23(7-8):673–692.
L'Ecuyer, P. and Lemieux, C. (2002). Recent advances in randomized quasi-Monte Carlo methods. Pages 419–474.
Lindemann, S. R. and LaValle, S. M. (2003). Incremental low-discrepancy lattice methods for motion planning. In Proceedings of the IEEE International Conference on Robotics and Automation, pages 2920–2927.
Niederreiter, H. (1992). Random Number Generation and
Quasi-Monte Carlo Methods. SIAM.
Owen, A. (2003). Quasi-Monte Carlo sampling. A chapter on QMC for a SIGGRAPH 2003 course.
Sendhoff, B., Beyer, H.-G., and Olhofer, M. (2004). The influence of stochastic quality functions on evolutionary search. In Recent Advances in Simulated Evolution and Learning, ser. Advances in Natural Computation, K. Tan, M. Lim, X. Yao, and L. Wang, eds. World Scientific, pages 152–172.
Shanno, D. F. (1970). Conditioning of quasi-Newton methods for function minimization. Mathematics of Computation, 24:647–656.
Sloan, I. and Woźniakowski, H. (1998). When are quasi-Monte Carlo algorithms efficient for high dimensional integrals? Journal of Complexity, 14(1):1–33.
Sutton, R. and Barto, A. (1998). Reinforcement Learning: An Introduction. MIT Press, Cambridge, MA.
Tsutsui, S. (1999). A comparative study on the effects of adding perturbations to phenotypic parameters in genetic algorithms with a robust solution searching scheme. In Proceedings of the 1999 IEEE Systems, Man, and Cybernetics Conference (SMC'99), vol. 3, pages 585–591. IEEE.
Tuffin, B. (1996). On the use of low-discrepancy sequences in Monte Carlo methods. Technical Report 1060, I.R.I.S.A.
Wasilkowski, G. and Woźniakowski, H. (1997). The exponent of discrepancy is at most 1.4778. Mathematics of Computation, 66:1125–1132.
Wright, M. (1995). Direct search methods: Once scorned, now respectable. In Numerical Analysis (D. F. Griffiths and G. A. Watson, eds.), Pitman Research Notes in Mathematics, pages 191–208. http://citeseer.ist.psu.edu/wright95direct.html.
Zhu, C., Byrd, R., Lu, P., and Nocedal, J. (1994). L-BFGS-B: a limited memory FORTRAN code for solving bound constrained optimization problems. Technical Report, EECS Department, Northwestern University.
NONLINEAR PROGRAMMING IN APPROXIMATE DYNAMIC PROGRAMMING - Bang-bang Solutions, Stock-management and Unsmooth Penalties