COMBINATORIAL IMPLEMENTATION OF A PARAMETER-LESS
EVOLUTIONARY ALGORITHM
Gregor Papa
Computer Systems Department, Jožef Stefan Institute, Ljubljana, Slovenia
Keywords:
Parameter-less, Evolution, Search, Combinatorial, Optimization.
Abstract:
The paper presents the combinatorial implementation of an adaptive parameter-less evolutionary-based search.
The algorithm is an extension of a basic numerical algorithm that does not need any predefined control parameter values. These values are calculated by the algorithm itself, according to the progress of the search.
The efficiency of the proposed autonomous parameter-less algorithm is evaluated by two real-world industrial
optimization problems.
1 INTRODUCTION
Finding an appropriate parameter setup of an evolutionary algorithm is a long-standing research challenge (Eiben et al., 2007; Kang et al., 2006). Different optimization techniques, such as genetic algorithms (GA), differential evolution (DE), evolution strategies (ES), or particle swarm optimization (PSO), require several important control parameters that need to be set in advance to ensure effective optimization.
The issue of setting the values of various parameters
of an evolutionary algorithm is crucial for good per-
formance (Eiben et al., 2007). Furthermore, it has
been empirically and theoretically demonstrated that different values of the parameters might be optimal at different stages of the evolutionary process (Stephens et al., 1998; Bäck, 1992). Therefore, approaches that can solve any problem without any human intervention for setting suitable control parameters are particularly interesting.
To skip the pre-setting of the control parameters, a new technique was developed that is able to find near-optimal solutions relatively quickly. The presented algorithm is an extension of the original algorithm (Papa, 2008), adapted to combinatorial problems. The algorithm, Parameter-Less Evolutionary Search (PLES), is based on a basic GA, but it does not need any control parameters, e.g., population size, number of generations, or probabilities of crossover and mutation, to be set in advance; they are calculated during the search progress.
2 PARAMETER-LESS SEARCH
The control parameters depend on the behavior and convergence of the found solutions. The pseudo-code of the algorithm is presented in Figure 1. In general, solutions are selected and recombined in the PLES as they are in the GA, but the implementation of these operators is different. The elitism, selection, and crossover are implemented through the forcing-of-better-individuals function, while mutation is split between the forcing of better individuals and the moving of individuals. The control parameters are never set in advance and are not constant. They are determined each time on the basis of statistical properties of each population. In addition, the algorithm varies the population size. Some constants are used in the calculation of the control parameters, but they are treated as meta-parameters. These constants ensure that the initial population and the other parameters are large enough for the algorithm to be operational. Furthermore, these meta-parameters are never changed, not even for different problems.
Setup. The chromosome that represents the solution is constructed from the variables of the problem, i.e., for n variables the chromosome is a string of n values.
Initialization. The initial population size ensures a population large enough to search the solution space.
Parameter-Less Evolutionary Search
  Set the initial population.
  Evaluate the initial population.
  While stopping criterion not met do
    Force better individuals to replace worse ones.
    Move individuals.
    Evaluate the current population.
    Vary population size.

Figure 1: The pseudo-code of the PLES algorithm.
$PopSize_0 = 4\sqrt{n} + 10\log_{10}(Range),$

$Range = \prod_{j=1}^{n} (max_j - min_j + 1) \cdot 10^{decplc_j},$

where n is the number of variables to be optimized, $max_j$ and $min_j$ are the upper and the lower limit of the j-th variable, respectively, and $decplc_j$ is the number of decimal places of the j-th variable. In the case of combinatorial optimization $decplc_j = 0$ and $(max_j - min_j + 1)$ is equal to the number of possible combinations of the j-th variable, $combinations_j$, therefore

$Range = \prod_{j=1}^{n} combinations_j.$
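A minimal Python sketch of this sizing rule for the combinatorial case is given below; the function name is illustrative, computing the logarithm of the product as a sum of logarithms is a choice made here to avoid overflow, and the $\sqrt{n}$ term follows the reconstruction of the formula above.

import math

def initial_population_size(combinations):
    """Sketch of the sizing rule above for the combinatorial case:
    PopSize_0 = 4*sqrt(n) + 10*log10(Range), Range = prod_j combinations_j."""
    n = len(combinations)
    log10_range = sum(math.log10(c) for c in combinations)  # log of the product
    return int(round(4 * math.sqrt(n) + 10 * log10_range))

# Example: 10 variables, each with 5 possible values
print(initial_population_size([5] * 10))  # -> 83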
Stopping Criterion. The number of overall generations depends on the convergence speed of the best solution found. Optimization proceeds as long as a better solution is found every few generations. When there is no improvement of the best solution for $Limit = 10\log_{10}(PopSize_i)$ generations, the optimization process stops.
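A sketch of this stopping rule, assuming minimization and a per-generation record of the best fitness found so far, could look as follows (the helper name and argument layout are illustrative):

import math

def should_stop(best_history, pop_size):
    """Stop when the best solution has not improved for
    Limit = 10*log10(PopSize_i) generations (minimization assumed)."""
    limit = max(1, int(round(10 * math.log10(pop_size))))
    if len(best_history) <= limit:
        return False                       # not enough generations observed yet
    recent = min(best_history[-limit:])    # best fitness in the last `limit` generations
    earlier = min(best_history[:-limit])   # best fitness before that window
    return recent >= earlier               # no improvement within the window -> stop

# Example: the best value (8) was last improved more than Limit generations ago
print(should_stop([10, 9, 8] + [8] * 12, pop_size=10))  # -> True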
Variable Population Size. During the search process the population size is changed, since the quality of the solution might depend on the size of the population. The population size is changed every $\frac{Limit}{5}$ generations, based on the average change of the standard deviation (StDev) of the fitness values of the solutions over the last few generations:

$PopSize_{i+1} = PopSize_i \cdot \frac{StDev_{i-1} + StDev_{i-2}}{StDev_i + StDev_{i-1}}$

The change of the population size is limited to ±20% per change and is further limited to the interval $[\frac{PopSize_0}{5}, 1.1 \cdot PopSize_0]$. The population shrinking enables the search with smaller populations, which is suitable for some types of problems, or in some stages of the search.
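The resizing rule can be sketched as below; the function name is illustrative, and the clamping order (first the ±20% limit, then the absolute bounds) is an assumption, as the text does not specify it:

def next_population_size(pop_size, stdevs, initial_size):
    """PopSize_{i+1} = PopSize_i * (StDev_{i-1} + StDev_{i-2}) / (StDev_i + StDev_{i-1}),
    limited to a +/-20% change and to [PopSize_0 / 5, 1.1 * PopSize_0].
    stdevs[-1] is StDev_i, stdevs[-2] is StDev_{i-1}, and so on."""
    if len(stdevs) < 3 or stdevs[-1] + stdevs[-2] == 0:
        return pop_size
    factor = (stdevs[-2] + stdevs[-3]) / (stdevs[-1] + stdevs[-2])
    factor = max(0.8, min(1.2, factor))                     # at most 20% change per update
    new_size = pop_size * factor
    new_size = max(initial_size / 5, min(1.1 * initial_size, new_size))
    return int(round(new_size))

# Example: a shrinking fitness spread gives a factor > 1, clamped to 1.1*PopSize_0
print(next_population_size(100, stdevs=[9.0, 7.0, 5.0], initial_size=100))  # -> 110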
Forcing Better Solutions. In every generation worse individuals are replaced with better individuals. After that, every $s_j^i$ (variable j of the solution i) is randomly moved by up to 20% of the difference between the variable and its limit (upper or lower):

$s_j^i = \begin{cases} s_j^{(i-1)} + rnd \cdot (max_j - s_j^{(i-1)}) & ; rnd \geq 0 \\ s_j^{(i-1)} + rnd \cdot (s_j^{(i-1)} - min_j) & ; rnd < 0 \end{cases}$

where rnd is a random number from [-0.2, 0.2].

In the case of combinatorial optimization, $s_j^i$ is moved to one of the neighboring values. Here, 20% of the positions j are changed to a neighboring value, where the neighborhood is defined as 20% of all possible combinations of the variable's values. If in $s^{(i-1)}$ the value of the variable j is $j_k$, then in $s^i$ the variable has the value $j_{(k + rnd \cdot m)}$, where $j = \{j_1, j_2, \ldots, j_k, \ldots, j_m\}$; variable j has m possible combinations.

Furthermore, in the case of ordered values the move is performed by swapping randomly chosen pairs of variables inside the solution. Again, up to 20% of the variable pairs are swapped.
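The two combinatorial variants of this move, the shift to a neighboring value and the pair swap for ordered encodings, might be sketched as follows; the function names, the clipping at the domain boundaries, and the exact handling of the 20% fractions are illustrative choices, not taken from the original implementation:

import random

def neighbor_move(solution, domains, fraction=0.2):
    """Shift roughly 20% of the positions to a neighboring value; the neighborhood
    spans up to 20% of a variable's possible values (clipped at the domain edges)."""
    new = list(solution)
    n = len(new)
    for j in random.sample(range(n), max(1, int(fraction * n))):
        values = domains[j]                          # possible values of variable j
        m = len(values)
        k = values.index(new[j])                     # current index within the domain
        step = random.randint(-max(1, int(fraction * m)), max(1, int(fraction * m)))
        new[j] = values[max(0, min(m - 1, k + step))]
    return new

def swap_pairs(solution, fraction=0.2):
    """For ordered values: swap up to 20% of randomly chosen variable pairs."""
    new = list(solution)
    for _ in range(max(1, int(fraction * len(new) / 2))):
        a, b = random.sample(range(len(new)), 2)
        new[a], new[b] = new[b], new[a]
    return new

domains = [[0, 1, 2, 3, 4]] * 6
print(neighbor_move([0, 2, 4, 1, 3, 0], domains))
print(swap_pairs([0, 2, 4, 1, 3, 0]))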
Solution Moving. In the PLES, mutation is realized through the moving of some positions in the chromosome according to different statistical properties. The ratio of the parameters in the chromosome to be moved is calculated on the basis of the standard deviation of the solutions in the previous generations, as stated in the following equation:

$Ratio_i = \frac{StDev_{i-1}}{StDev_{i-3}} \cdot n$

where $StDev_{i-1}$ and $StDev_{i-3}$ are the standard deviation of the solution fitness of the previous generation, and the standard deviation three generations ago, respectively. Here $Ratio_i \in [0 \ldots n]$, and $Ratio_i$ positions in the chromosome are selected to be moved.

The size of the move is calculated according to the difference between the value of the parameter of the globally best solution and the current solution, as presented in the following equations:

$MoveRatio = \frac{s_j^{best} - s_j^{(i-1)}}{average_j^{(i-1)} - s_j^{(i-1)}}$

where $s_j^i$ is the value of the parameter j of the current solution i, $s_j^{best}$ is the value of parameter j of the globally best solution, and $average_j^{(i-1)}$ is the average value of the parameter j in the previous generation.

$Width = s_j^{best} - s_j^{(i-1)}$

$s_j^i = s_j^{(i-1)} + Direction \times MoveRatio \times Width$
where Direction is randomly selected (-1 or 1).
In the case of combinatorial optimization $s_j^i$ is rounded, since only integer values are accepted.
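A sketch of the whole moving step follows; the argument names mirror the quantities defined above, the clamping of $Ratio_i$ to at most n and the zero-denominator guard are assumptions made here, and the rounding handles the combinatorial case:

import random

def move_solution(current, best, averages, stdevs, integer_valued=True):
    """Move Ratio_i = (StDev_{i-1} / StDev_{i-3}) * n positions of the current solution
    by Direction * MoveRatio * Width, as described above.
    stdevs[-1] is StDev_{i-1} and stdevs[-3] is StDev_{i-3}."""
    n = len(current)
    if len(stdevs) >= 3 and stdevs[-3] > 0:
        ratio = int(round(min(1.0, stdevs[-1] / stdevs[-3]) * n))   # clamp to [0..n]
    else:
        ratio = n
    new = list(current)
    for j in random.sample(range(n), max(1, ratio)):
        width = best[j] - current[j]
        denom = averages[j] - current[j]
        move_ratio = width / denom if denom != 0 else 0.0           # zero-denominator guard
        direction = random.choice((-1, 1))
        value = current[j] + direction * move_ratio * width
        new[j] = int(round(value)) if integer_valued else value     # combinatorial: integers only
    return new

print(move_solution([3, 1, 4, 2], best=[2, 2, 2, 2],
                    averages=[2.5, 1.5, 3.0, 2.0], stdevs=[4.0, 3.0, 2.0]))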
Solution Evaluation and Statistics. After the recombination operations the population is statistically evaluated. Here, the best, the worst, and the average fitness values in the generation are found. Furthermore, the standard deviation of the fitness values of all solutions in the generation, and the average value of each variable, are calculated.
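These per-generation statistics amount to a few standard computations; a straightforward sketch (minimization is assumed, so the lowest fitness value is the best):

import statistics

def population_statistics(population, fitness):
    """Best, worst, and average fitness, the standard deviation of the fitness values,
    and the average value of each variable in the current generation."""
    return {
        "best": min(fitness),
        "worst": max(fitness),
        "average": statistics.fmean(fitness),
        "stdev": statistics.pstdev(fitness),
        "variable_averages": [statistics.fmean(col) for col in zip(*population)],
    }

print(population_statistics([[1, 2], [3, 4], [5, 6]], [10.0, 7.0, 4.0]))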
3 INDUSTRIAL PROBLEMS
3.1 Production Planning
Optimal planning of the production in the company
that produces different components for domestic ap-
pliances has to consider various constraints. The
most demanding production stage is the production
of cooking hot-plates. The production of the com-
ponents and parts for all the types of plates is more
or less similar, but the standard plate models of the
current range differ in size (height, diameter), con-
nector type and power characteristics. Many differ-
ent models exist due to the various demands of other
companies that use those plates for their own cooking
appliances. Orders for some particular models usu-
ally vary in quantities and deadlines. Orders from the
same company are usually also connected with the
same deadline. Therefore, their production must be
planned very carefully to fulfil all the demands (quan-
tities and deadlines), to maintain the specified amount
of different models in the stock, to optimally occupy
their workers, and to efficiently use all the produc-
tion lines. Also, not all the production lines are equal,
since each of them can produce only a few differ-
ent models. The production planning problem con-
sists of finding a production plan that satisfies produc-
tion time constraints and minimizes production costs
(Korošec et al., 2010). Solving it involves many specific constraints that need to be considered. The main problem is the exchange delay, caused by adapting production lines to different types of products and supplying the appropriate parts. The manufacturing processes of multiple types of products require many different steps, and different product parts for the completion of each product type. We used the combinatorial version of the PLES algorithm to find the optimal production plan. To obtain optimal results, some local search procedures were also used with the algorithm (Korošec et al., 2010).
The cost function (Eq. (1)), as defined in (Korošec et al., 2010), considers the number of delayed orders ($n_{orders}$), the exchange delay times ($t_{exchange}$), the overall production time ($t_{overall}$), and the sum of the squared days of delayed orders ($n_{days}$):

$f(P) = 10^8 \cdot n_{orders} + 10^4 \cdot t_{exchange} + t_{overall} + n_{days}.$   (1)
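As a worked illustration of Eq. (1), the weighting makes a single delayed order dominate every other term; the evaluation below uses hypothetical input values:

def production_cost(n_orders, t_exchange, t_overall, n_days):
    """Eq. (1): delayed orders dominate, then exchange delays, then overall time
    and the sum of squared days of delay."""
    return 1e8 * n_orders + 1e4 * t_exchange + t_overall + n_days

# Hypothetical plan: 1 order delayed by 3 days, 120 units of exchange delay, 5000 units overall
print(production_cost(n_orders=1, t_exchange=120, t_overall=5000, n_days=3 ** 2))
# -> 100000000 + 1200000 + 5000 + 9 = 101205009.0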
3.2 Cooling Appliance Setting
Besides the built-in functionalities, household appliances must have low power consumption and energy-efficient, optimal performance. Optimal performance
means that the appliance is able to cool to the de-
sired temperature at the lowest possible power con-
sumption. Such optimization usually requires a lot
of long-term development measurements or a thor-
ough theoretical analysis of the cooling system and
the construction of a complex mathematical model for
its simulation (Angeli and Kountouriotis, 2011).
To determine the optimal performance of the re-
frigerator, we need to perform a set of development
measurements for each new type of appliance
(Papa and Mrak, 2010). Thermal processes in cooling
systems are by nature very slow. One measurement
for determining the power consumption under stan-
dard conditions takes several days. To speed up the development process, a temperature simulator was used in connection with the optimizer. The simu-
lation tool simulates temperatures inside the cooling
appliance at different modes of regulation. The opti-
mizer uses an evolutionary heuristic search approach
to find the optimal set of control parameters itera-
tively over evolving generations. The mixed com-
binatorial/numerical version of the PLES algorithm
was used to find the optimal control parameters of an
appliance that give optimal performance with the lowest possible power consumption.
The cost function (Eq. (2)), as defined in (Papa and Mrak, 2010), considers the power consumption in the calculation interval, the length of the calculation interval, the mean and the reference temperature in the C1 and C2 cabinets (meanT and refT), and the standard deviations ($\sigma_{ON}$ and $\sigma_{OFF}$) of the ON and OFF times of the compressor:

$f(P) = \frac{2800 \cdot consumption_{interval}}{12000 \cdot length_{interval}} + |meanT_{C1} - refT_{C1}| + |meanT_{C2} - refT_{C2}| + \frac{\sigma_{ON}}{1000} + \frac{\sigma_{OFF}}{1000}.$   (2)
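A corresponding sketch of Eq. (2) is shown below; the grouping of the constants 2800 and 12000 into a single normalized-consumption term follows the reconstruction above and should be read as an assumption, and the example values are hypothetical:

def cooling_cost(consumption, length, mean_t_c1, ref_t_c1,
                 mean_t_c2, ref_t_c2, sigma_on, sigma_off):
    """Eq. (2): normalized consumption over the interval, plus temperature deviations
    in the C1 and C2 cabinets, plus down-weighted variability of compressor ON/OFF times."""
    return ((2800 * consumption) / (12000 * length)
            + abs(mean_t_c1 - ref_t_c1) + abs(mean_t_c2 - ref_t_c2)
            + sigma_on / 1000 + sigma_off / 1000)

# Hypothetical regulation setting evaluated on a simulated interval
print(cooling_cost(consumption=1.2, length=24.0, mean_t_c1=5.1, ref_t_c1=5.0,
                   mean_t_c2=-18.2, ref_t_c2=-18.0, sigma_on=300.0, sigma_off=450.0))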
4 RESULTS
The PLES was run 20 times for each problem. The experiments were done on a 2.2 GHz computer, and each run took approximately 5 minutes for production planning, and less than 1 minute for cooling appliance optimization. However, time complexity was not the subject of this evaluation.
4.1 Production Plan
The PLES algorithm was tested on two different real order lists from the production company. Task 1 consists of n = 711 orders for 251 products, while Task 2 consists of n = 737 orders for 262 products. In both tasks m = 5 production lines are available. The number of evaluations was limited to 500,000.
While the PLES does not need any parameters, the GA used for comparison had the following parameter settings: population size N = 100; number of generations 5,000; replacement rate r = 0.2; crossover probability $p_c = 0.7$; mutation probability $p_m = 0.005$.
In Table 1 the best, mean, worst, and standard deviation of the solutions are presented for each task.

Table 1: Results of optimization for Task 1 and Task 2.

                     PLES           GA
Task 1   Best    1.309 ×10^8    1.308 ×10^8
         Mean    1.327 ×10^8    1.340 ×10^8
         Worst   1.416 ×10^8    1.610 ×10^8
         StD     3.230 ×10^6    6.526 ×10^6
Task 2   Best    1.576 ×10^8    1.611 ×10^8
         Mean    1.664 ×10^8    1.748 ×10^8
         Worst   1.813 ×10^8    1.914 ×10^8
         StD     7.390 ×10^6    7.235 ×10^6
Compared with the previous approach to production planning (Korošec et al., 2010), the expert's manual plan for these two tasks had about four times more delayed orders, which is significantly worse than the results obtained by both algorithms. The PLES algorithm shows faster convergence and proves its ability to find the solution without predefined control parameter settings.
4.2 Cooling Appliance
The PLES algorithm was used to speed up the development process of a new type of cooling appliance. The PLES was able to find the optimal set of cooling appliance control parameter settings in all runs.
Optimization based on a stochastic search method might give several possible solutions. The criterion for choosing the appropriate one is not only the smallest energy consumption and the desired temperature; it is also necessary to verify the behavior of the components, since their frequent on/off switching shortens their life cycle.
5 CONCLUSIONS
The paper presented an adaptive parameter-less evo-
lutionary search for combinatorial problems. The ef-
ficiency of the proposed parameter-less algorithm was
evaluated by two real-world industrial optimization
problems. Following the previously presented numerical results, it was shown that the combinatorial implementation of the algorithm had faster convergence than the comparison algorithm and also proved its ability to find the solution without predefined control parameters. Furthermore, it was shown that the combinatorial implementation of the algorithm is as effective as the numerical one.
REFERENCES
Angeli, D. and Kountouriotis, P.-A. (2011). A stochastic approach to "dynamic-demand" refrigerator control. IEEE Transactions on Control Systems Technology, PP(99):1–12.
Bäck, T. (1992). The interaction of mutation rate, selection, and self-adaptation within a genetic algorithm. In Männer, R. and Manderick, B., editors, Proceedings of the 2nd Conference on Parallel Problem Solving from Nature. North-Holland, Amsterdam.
Eiben, A., Michalewicz, Z., Schoenauer, M., and Smith, J.
(2007). Parameter control in evolutionary algorithms.
In Lobo, F., Lima, C., and Michalewicz, Z., edi-
tors, Parameter Setting in Evolutionary Algorithms,
volume 54 of Studies in Computational Intelligence,
pages 19–46. Springer Berlin / Heidelberg.
Kang, Q., Wang, L., and di Wu, Q. (2006). Research on
fuzzy adaptive optimization strategy of particle swarm
algorithm. International Journal of Information Tech-
nology, 12(3):65–77.
Korošec, P., Papa, G., and Vukašinović, V. (2010). Application of memetic algorithm in production planning. In Proc. Bioinspired Optimization Methods and their Applications, BIOMA 2010, pages 163–175.
Papa, G. (2008). Parameter-less evolutionary search. In
GECCO, pages 1133–1134.
Papa, G. and Mrak, P. (2010). Optimization of cooling ap-
pliance control parameters. In Proceedings of the 2nd
International Conference on Engineering Optimiza-
tion, EngOpt2010.
Stephens, C. R., Olmedo, I. G., Vargas, J. M., and Wael-
broeck, H. (1998). Self-adaptation in evolving sys-
tems. Artif. Life, 4:183–201.