
, mutation, mate selection, population replacement, fitness scaling, etc., proving that with these simple operators a GA does not converge to a population containing only optimal members. However, there are GAs that do converge to the optimum: the Elitist GA and those which introduce Reduction Operators (Eiben et al., 1991). Our GA balances two goals: exploiting blind search like a canonical GA and exploiting statistical properties like a standard ICA algorithm. In order to include statistical information in the algorithm (it would be nonsensical to ignore it!) we define the hybrid statistical genetic operator, based on reduction operators, as follows:
$$G^{q,M_n}_{p} = \frac{1}{\aleph(T_n)}\,\exp\!\left(\frac{\|q - S_n \cdot p\|^{2}}{T_n}\right); \qquad p, q \in \wp^{N} \qquad (4)$$
where $\aleph(T_n)$ is the normalization constant depending on the temperature $T_n$, $n$ is the iteration and $S_n$ is the step matrix which contains the statistical properties, i.e., based on cumulants it can be expressed using quasi-Newton algorithms as (Hyvärinen and Oja, ):
$$S_n = \left(I - \mu_n\left(C^{1,\beta}_{y,y}\, S^{\beta}_{y} - I\right)\right); \qquad p_i \in C \qquad (5)$$
where $C^{1,\beta}_{y,y}$ is the cross-cumulant matrix whose elements are $[C^{\alpha,\beta}_{y,y}]_{ij} = \mathrm{Cum}(\underbrace{y_i,\ldots,y_i}_{\alpha},\underbrace{y_j,\ldots,y_j}_{\beta})$ and $S^{\beta}_{y}$ is the sign matrix of the output cumulants.
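The sketch below illustrates how the step matrix of Eq. (5) and the guided operator of Eq. (4) could be computed. It is a minimal sketch, assuming $\beta = 3$ (fourth-order cumulants), a diagonal sign matrix $S^{\beta}_{y}$, and that $\aleph(T_n)$ is the sum of the unnormalized weights over a finite candidate set; none of these choices are fixed by the text above.

```python
import numpy as np

def cross_cumulant_matrix(y):
    """Fourth-order cross-cumulant matrix C^{1,3}_{y,y} for zero-mean outputs y
    (rows = components, columns = samples):
    [C]_{ij} = Cum(y_i, y_j, y_j, y_j) = E[y_i y_j^3] - 3 E[y_i y_j] E[y_j^2]."""
    T = y.shape[1]
    m2 = y @ y.T / T                       # second-order moments E[y_i y_j]
    m13 = y @ (y ** 3).T / T               # fourth-order moments E[y_i y_j^3]
    var = np.diag(m2)                      # E[y_j^2]
    return m13 - 3.0 * m2 * var[None, :]

def step_matrix(y, mu):
    """Quasi-Newton step matrix of Eq. (5): S_n = I - mu_n (C^{1,3}_{y,y} S^3_y - I).
    The sign matrix S^3_y is assumed diagonal, holding the signs of the
    output auto-cumulants (kurtosis signs)."""
    C = cross_cumulant_matrix(y)
    S_sign = np.diag(np.sign(np.diag(C)))
    I = np.eye(y.shape[0])
    return I - mu * (C @ S_sign - I)

def guided_operator_probs(p, candidates, S_n, T_n):
    """Transition probabilities of Eq. (4), taken as written, from individual p
    to each candidate q; aleph(T_n) is assumed to be the sum over candidates."""
    scores = np.array([np.exp(np.linalg.norm(q - S_n @ p) ** 2 / T_n)
                       for q in candidates])
    return scores / scores.sum()
```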
Finally the guided GA (GGA) is modelled, at each
step, as the stochastic matrix product acting on prob-
ability distributions over populations:
$$G_n = P^{n}_{R} \cdot F_n \cdot C^{k}_{P^{n}_{c}} \cdot M^{n}_{(P_m, G)} \qquad (6)$$
where $F_n$ is the selection operator, $P^{n}_{R}$ a reduction operator, $C^{k}_{P^{n}_{c}}$ is the cross-over operator and $M^{n}_{(P_m,G)}$ are the mutation and guided operators.
The GA applies local search (using the selected mutation and crossover operators) around the values (individuals) found to be optimal (the elite) in the previous generation. The computational time depends on the encoding length and on the number of individuals and genes. Because of the probabilistic nature of the GA-based method, the proposed method converges, on average, to a globally optimal solution; in our simulations no non-convergent case was found.
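A minimal sketch of one GGA generation composing the operators of Eq. (6) is given below. Elitist reduction, fitness-proportional selection, single-point crossover and Gaussian mutation are standard choices, and the guided step is a simplified deterministic surrogate for the Gibbs-type operator of Eq. (4); the probabilities `p_c`, `p_m` and the elite size are illustrative values, not taken from the paper.

```python
import numpy as np

def gga_generation(pop, fitness_fn, S_n, T_n, p_c=0.8, p_m=0.05, n_elite=2, rng=None):
    """One generation of the guided GA, Eq. (6): reduction (elitism),
    selection, crossover, mutation and the guided (statistical) operator.
    pop is a list of 1-D numpy arrays (real-coded individuals)."""
    rng = rng or np.random.default_rng()
    fits = np.array([fitness_fn(ind) for ind in pop])

    # Reduction operator P_R^n: keep the elite unchanged (elitism).
    elite = [pop[i].copy() for i in np.argsort(fits)[-n_elite:]]

    # Selection operator F_n: fitness-proportional sampling.
    probs = fits - fits.min() + 1e-12
    probs /= probs.sum()
    parents = [pop[i] for i in rng.choice(len(pop), size=len(pop) - n_elite, p=probs)]

    children = []
    for a, b in zip(parents[0::2], parents[1::2] + parents[:1]):
        # Crossover operator C^k_{P_c^n}: single-point crossover with prob. p_c.
        if rng.random() < p_c:
            k = rng.integers(1, len(a))
            a, b = np.r_[a[:k], b[k:]], np.r_[b[:k], a[k:]]
        children += [a.copy(), b.copy()]

    out = []
    for child in children[:len(pop) - n_elite]:
        # Mutation operator M: small Gaussian perturbation with prob. p_m per gene.
        mask = rng.random(len(child)) < p_m
        child = child + mask * rng.normal(0.0, 0.1, len(child))
        # Guided operator (surrogate): pull the offspring towards the
        # quasi-Newton prediction S_n @ child, more strongly at low temperature.
        child = child + (S_n @ child - child) / (1.0 + T_n)
        out.append(child)
    return out + elite
```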
5 SIMULATIONS AND CONCLUSIONS
To check the performance of the proposed hybrid algorithm, 50 computer simulations were conducted to test the GGA against the GA method (without guide) and the most relevant ICA algorithm to date, FastICA (Hyvärinen and Oja, ). In this paper we omit the evaluation of the computational complexity of the current methods, which is described in detail in several references such as (Tan and Wang, 2001). The main reason is that we use an 8-node cluster of Pentium II 332 MHz machines with 512 KB cache, so the computational requirements of the algorithms (fitness functions, encoding, etc.) are generally negligible compared with the cluster capacity. Naturally, GA-based BSS approaches suffer from a higher computational complexity.
Consider the mixing cases from 2 to 20 independent random super-Gaussian input signals. We focus our attention on the evolution of the crosstalk versus the number of iterations, using a mixing matrix randomly chosen in the interval [−1, +1]. The number of individuals chosen in the GA methods was $N_p = 30$ in the 50 simulations (with randomly chosen mixing matrices), for a number of input sources ranging from 2 (the standard BSS problem) to 20 (BSS in biomedicine or finance). The standard deviation of the separation parameters over the 50 runs never exceeded 1% of their mean values, whereas with the FastICA method we found large deviations across different mixing matrices due to its limited local-search capacity as the dimension increases. The results for the crosstalk are displayed in Table 5. It can be seen from the simulation results that the FastICA convergence rate decreases as the dimension increases, whereas the GA approaches work efficiently.
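For reference, a possible way to set up such an experiment and to measure the crosstalk of a separation matrix W against a random mixing matrix A is sketched below. The exact crosstalk formula used in the paper is not spelled out, so the definition here (residual leakage of the global system W·A, in dB) is an assumption.

```python
import numpy as np

def crosstalk_db(W, A):
    """Crosstalk of a separation matrix W against a mixing matrix A, measured
    on the global system P = W @ A: leakage outside the dominant entry of each
    row, in dB. This is one common definition, assumed here, not the paper's."""
    P = np.abs(W @ A)
    P = P / P.max(axis=1, keepdims=True)   # normalise each row by its dominant entry
    residual = P.sum(axis=1) - 1.0         # leakage per recovered source
    return 10.0 * np.log10(residual.mean() + 1e-12)

# Illustrative setup matching the experiments: a random mixing matrix in
# [-1, +1] for m sources (m ranges from 2 to 20 in the paper).
rng = np.random.default_rng(0)
m = 4
A = rng.uniform(-1.0, 1.0, size=(m, m))
W = np.linalg.inv(A)                       # ideal separator, for a sanity check
print(crosstalk_db(W, A))                  # large negative value => low crosstalk
```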
A GGA-based BSS method has been developed to solve the BSS problem for linear mixtures of independent sources. The proposed method obtains good performance, overcoming the local-minima problem over multidimensional domains (see Table 5). Extensive simulation results prove the ability of the proposed method. This is particularly useful in medical applications, where the input space dimension increases, and in real-time applications, where reaching fast convergence rates is the major objective.
Minimizing the regularized risk functional, using an operator that enforces flatness in feature space, we build a hybrid model that achieves high prediction performance (Górriz et al., 2003) compared with previous on-line algorithms for time series forecasting. This performance is similar to the one achieved by SVM but with a lower computational time demand, an essential feature in real-time systems. The benefit of choosing SVM for regression consists in solving a uniquely solvable quadratic optimization problem, unlike general RBF networks, which require non-linear optimization with the danger of getting stuck in local minima. Nevertheless, the RBF networks used in this paper, with the help of various techniques, obtain high performance even under extremely volatile conditions, since the level of noise and the change of delay operation mode applied to the chaotic dynamics were rather high.
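As an illustration of the convexity argument only, the following snippet fits an ε-SVR (a uniquely solvable quadratic programme) to a noisy synthetic series embedded with lagged values; the data, lag depth and hyper-parameters are arbitrary and not those of (Górriz et al., 2003).

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic, only-illustrative noisy series (not the paper's chaotic data).
rng = np.random.default_rng(1)
t = np.arange(500)
x = np.sin(0.05 * t) + 0.3 * np.sin(0.017 * t) + 0.1 * rng.normal(size=t.size)

# Embed the series: predict x[t] from the previous 5 lagged values.
lags = 5
X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])
y = x[lags:]

# epsilon-SVR solves a convex QP, so repeated fits give the same model,
# with no risk of a bad local minimum (hyper-parameters are illustrative).
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:-50], y[:-50])
print("test MSE:", np.mean((model.predict(X[-50:]) - y[-50:]) ** 2))
```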