Figure 4 shows the percentage of classification
success obtained with LDA for raw data (black bar)
and clean data (white bar), using all 5 frequency
bands as features (see Section 3.1).
Although the results are not sufficiently good, the
cleaning procedure improves classification success
by 6.48 percentage points, from 47.37 % to 53.85 %.
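The LDA experiment above can be sketched as follows with scikit-learn; this is a minimal illustration, not the authors' implementation, and the data here is a synthetic stand-in for the 5 frequency-band features.

```python
# Hedged sketch of binary LDA classification on band-power features.
# The real data are EEG frequency-band features; here we use synthetic data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))        # 100 trials x 5 frequency-band features
y = rng.integers(0, 2, size=100)     # two classes (binary problem)
X[y == 1] += 0.5                     # give the classes some separation

lda = LinearDiscriminantAnalysis()
lda.fit(X[:70], y[:70])              # train on the first 70 trials
accuracy = lda.score(X[70:], y[70:]) # fraction of test trials classified correctly
print(f"classification success: {100 * accuracy:.2f} %")
```

Since LDA is a linear classifier, its success rate is bounded by how linearly separable the two classes are in the feature space, which motivates the non-linear classifier of the next section.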
4.2 Neural Network
In recent years, many classification systems have
been implemented using techniques such as neural
networks, which are widely used and well
established in pattern recognition applications.
An artificial neural network (ANN) is a
mathematical model that tries to simulate the
structure and/or functional aspects of biological
neural networks. It consists of an interconnected
group of artificial neurons and processes information
using a connectionist approach to computation. In
most cases an ANN is an adaptive system that
changes its structure based on external or internal
information that flows through the network during
the learning phase.
Neural networks are non-linear statistical data
modelling tools. They can be used to model complex
relationships between inputs and outputs or to find
patterns in data.
One of the simplest ANNs is the so-called
perceptron, which consists of a single layer and
implements a discrimination rule between classes
based on a linear discriminator. However, it is
possible to discriminate between classes that are not
linearly separable by using multilayer perceptrons
(MLPs).
The multilayer perceptron (MLP), also known as a
backpropagation network (BPN), is one of the best
known and most widely used artificial neural
network models for pattern classification and
function approximation (Lippman, 1987), (Freeman
and Skapura, 1991). It belongs to the class of
feedforward networks: its topology is composed of
several fully interconnected layers of neurons, in
which information always flows from the input
layer, whose only role is to send the input data to
the rest of the network, toward the output layer,
crossing all the layers (called hidden layers) between
input and output. Essentially, the inner layers are
responsible for carrying out the information
processing, extracting features from the input data.
Although there are many variants, usually each
neuron in one layer has directed connections to the
neurons of the subsequent layer, and there are no
connections between neurons in the same layer
(Bishop, 1995), (Hush and Horne, 1993).
In this work we have used a multilayer perceptron
with one hidden layer, whose number of neurons
(nodes) was determined empirically in each case.
Each connection of the network is associated with a
weight, and each neuron with a bias; these
parameters are obtained from training so that their
values are suitable for discriminating between the
different classes.
The number of input neurons is equal to the
number of frequency bands considered, and there is
a single output neuron, since we need to
discriminate between only two classes (a binary
problem).
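The topology just described can be illustrated with a minimal forward pass through a one-hidden-layer MLP; the layer sizes below match this work's setup (5 band features in, 1 output), while the weights are random placeholders rather than trained values.

```python
# Minimal sketch of a feedforward pass through a one-hidden-layer MLP.
# Weights/biases here are random placeholders; training would set them.
import numpy as np

def logistic(z):
    # Logistic (sigmoid) activation, squashing values into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 5, 50, 1           # 5 band features -> hidden -> 1 output

W1 = rng.normal(size=(n_hidden, n_in))     # input-to-hidden connection weights
b1 = np.zeros(n_hidden)                    # hidden-layer biases
W2 = rng.normal(size=(n_out, n_hidden))    # hidden-to-output connection weights
b2 = np.zeros(n_out)                       # output-layer bias

x = rng.normal(size=n_in)                  # one input pattern (band features)
h = logistic(W1 @ x + b1)                  # hidden layer extracts features
y_hat = logistic(W2 @ h + b2)              # output in (0, 1): class membership
```

The single sigmoid output is thresholded (e.g. at 0.5) to decide between the two classes.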
As shown before, LDA with cleaned data obtains
better results, with an improvement of 6.48
percentage points. For classification purposes,
however, these results are poor and not useful.
Hence, we conduct some experiments with neural
networks, in particular with multilayer perceptrons,
as the classification system. Since we now have a
non-linear classifier, we expect the percentage of
classification success to increase.
Figure 5: Classification results obtained with MLP. Black
bar corresponds to raw data (60.38 % of classification
success) and white bar to clean data (73.08 % of
classification success).
All the experiments were done with an MLP with
one hidden layer of 50 units with a logistic non-linear
activation function, trained with the scaled conjugate
gradient (SCG) algorithm (Moller, 1993) to find a
local minimum of the error function. The SCG
algorithm avoids a line search at each learning
iteration by using a Levenberg-Marquardt approach
to scale the step size.
BIOSIGNALS 2010 - International Conference on Bio-inspired Systems and Signal Processing