The evolution of the model relies on how the set of perceptions (the data space in the figure) that the brain stores is coded within the S level, and on how the corresponding B level is modeled. The general approach based on the use of RNNs can thus be viewed as an application of the S[B] paradigm.
To facilitate the concrete application of the S[B] paradigm, we will use as a collection of data the electrical signals of the brain (time series), which are the evidence of brain activity. They contain all the information regarding the topological network, namely the internal states of the brain and its connectivity. They are the input data for training an LSTM RNN whose topology will be measured to characterize the adaptation phase of the proposed model. According to the work of Giles and others (Giles et al., 1992), a second-order recurrent neural network can be mapped into a deterministic finite-state automaton (DFA); likewise, the LSTM RNNs will likely be mapped into classes of DFAs whose accepted regular languages will be used to describe the cerebral activities and to discriminate the physiological ones from the pathological ones.
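As a purely illustrative sketch of the training step described above (not the authors' implementation), the following Python/PyTorch fragment trains an LSTM to label fixed-length windows of multichannel brain signals as physiological or pathological; the channel count, window length, and hyperparameters are assumptions, and random tensors stand in for real recordings.

```python
# Minimal sketch: LSTM classifier for windows of multichannel brain signals.
# N_CHANNELS, WINDOW, HIDDEN and the random data are illustrative assumptions.
import torch
import torch.nn as nn

N_CHANNELS, WINDOW, HIDDEN = 16, 256, 64   # e.g. 16 electrodes, 256 samples per window

class SignalLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(N_CHANNELS, HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, 2)    # physiological vs. pathological

    def forward(self, x):                   # x: (batch, WINDOW, N_CHANNELS)
        _, (h, _) = self.lstm(x)            # h: (1, batch, HIDDEN), last hidden state
        return self.head(h.squeeze(0))      # class logits

model = SignalLSTM()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on random data standing in for labelled signal windows.
x = torch.randn(32, WINDOW, N_CHANNELS)
y = torch.randint(0, 2, (32,))
optim.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optim.step()
```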
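In the same spirit, the automaton-extraction idea can be sketched by quantising the hidden states visited by a trained RNN and recording the transitions between clusters under each input symbol. This is a hedged illustration of the state-clustering approach inspired by Giles et al. (1992), not their exact algorithm; the use of scikit-learn's KMeans and the toy data are assumptions.

```python
# Sketch of DFA extraction by hidden-state clustering (assumed technique, not
# the exact procedure of Giles et al., 1992).
import numpy as np
from sklearn.cluster import KMeans

def extract_automaton(hidden_states, inputs, n_states=8):
    """hidden_states: (T, H) array of consecutive RNN hidden vectors;
    inputs: length-T sequence of discretised input symbols.
    Returns a transition table (state, symbol) -> next state."""
    km = KMeans(n_clusters=n_states, n_init=10).fit(hidden_states)
    labels = km.labels_
    transitions = {}
    for t in range(len(labels) - 1):
        transitions[(labels[t], inputs[t + 1])] = labels[t + 1]
    return transitions

# Toy usage with random data standing in for the hidden states of a trained RNN.
H = np.random.randn(200, 16)
sym = np.random.randint(0, 2, size=200)
dfa = extract_automaton(H, sym, n_states=4)
```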
The real data on which we will test the proposed approach are electrocorticographic signals: they are recorded using a new device, the ECOGIW-16E (Piangerelli et al., 2014; Cristiani et al., 2012), developed by two Italian companies, AB-MEDICA s.p.a. and Aethra Telecommunications s.r.l. The device, completely wireless, will provide a huge amount of data by continuously recording electrical brain activity.
4 FINAL REMARKS
Starting from the description of the human brain as a self-adaptive system and exploiting the features of the second-order LSTM RNNs, we proposed to develop a hierarchical model made up of two levels, S, the structural one, and B, the behavioral one, entangled via a unique adaptation phase. The study of the evolution of the model, which rests on the adaptation phase, is characterized by the way in which the space of data is analyzed. In our scenario, the space of data represents the environment, the set of perceptions through which the behavior of a system evolves and its knowledge is updated. We proposed to analyze the set of data with an RNN in order to derive the so-called weight matrix, which allows us to build the corresponding complex network representing the emerging model at the current time. In the study of the evolution of the complex network, we aim to exploit the recent important results obtained by researchers of the TOPDRIM project (www.topdrim.eu) (Franzosi et al., 2014): a geometric entropy measuring network complexity, used to detect the emergence of the “giant component” as the onset of neurological disorders such as epileptic seizures.
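As a rough illustration of the weight-matrix-to-network step (not the geometric entropy of Franzosi et al., 2014, and not the authors' procedure), the following sketch thresholds a learned recurrent weight matrix into an undirected graph and tracks the relative size of its largest connected component as a crude proxy for the appearance of a giant component; the threshold value, the use of absolute weights, and the random matrix are illustrative assumptions.

```python
# Sketch: build a complex network from an RNN weight matrix and measure the
# fraction of nodes in its largest connected component (a proxy, not the
# geometric entropy of Franzosi et al., 2014).
import numpy as np
import networkx as nx

def giant_component_fraction(W: np.ndarray, threshold: float = 0.5) -> float:
    """Fraction of nodes in the largest connected component of the graph
    obtained by keeping edges whose |weight| exceeds `threshold`."""
    A = (np.abs(W) > threshold).astype(int)
    A = np.maximum(A, A.T)            # symmetrise: undirected connectivity
    np.fill_diagonal(A, 0)            # drop self-loops
    G = nx.from_numpy_array(A)
    largest = max(nx.connected_components(G), key=len)
    return len(largest) / G.number_of_nodes()

# Example with a random matrix standing in for the trained weight matrix.
W = np.random.randn(64, 64)
print(giant_component_fraction(W, threshold=1.0))
```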
ACKNOWLEDGEMENTS
We would like to thank Regione Marche and Aethra
Telecommunications s.r.l. for partially funding Pian-
gerelli’s Ph.D. studies with the Eureka Project.
We also acknowledge the financial support of the
Future and Emerging Technologies (FET) programme
within the Seventh Framework Programme (FP7) for
Research of the European Commission, under the
FET-Proactive grant agreement TOPDRIM, number
FP7-ICT-318121.
REFERENCES
Ashby, M. C. and Isaac, J. T. R. (2011). Maturation of a
recurrent excitatory neocortical circuit by experience-
dependent unsilencing of newly formed dendritic
spines. Neuron, 70(3):510–21.
Bruni, R., Corradini, A., Gadducci, F., Lluch Lafuente, A.,
and Vandin, A. (2012). A conceptual framework for
adaptation. In Fundamental Approaches to Software
Engineering, volume 7212 of Lecture Notes in Com-
puter Science, pages 240–254. Springer.
Cheng, B., Lemos, R. D., and Giese, H. (2009). Soft-
ware engineering for self-adaptive systems: A re-
search roadmap. Software engineering for . . . , pages
1–26.
Cristiani, P., Marchetti, S., Paris, A., Romanelli, P., and
Sebastiano, F. (2012). Implantable device for ac-
quisition and monitoring of brain bioelectric signals
and for intracranial stimulation. WO Patent App.
PCT/IB2012/051,909.
Franzosi, R., Felice, D., Mancini, S., and Pettini, M. (2014). A geometric entropy measuring network complexity. Submitted.
Funahashi, K. and Nakamura, Y. (1993). Approximation of
dynamical systems by continuous time recurrent neu-
ral networks. Neural Networks, 6(6):801–806.
Giles, C. L., Miller, C. B., Chen, D., Chen, H.-H., Sun,
G.-Z., and Lee, Y.-C. (1992). Learning and extract-
ing finite state automata with second-order recurrent
neural networks. Neural Computation, 4(3):393–405.
Hochreiter, S. and Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8):1735–1780.
Khakpour, N., Jalili, S., Talcott, C., Sirjani, M., and
Mousavi, M. (2012). Formal modeling of evolv-
ing self-adaptive systems. Sci. Comput. Program.,
78(1):3–26.