selves non-recursive (i.e., non-Turing-computable), since the consideration of any kind of recursive evolution would necessarily restrict the corresponding networks to no more than Turing capabilities. Hence, according to this model, the existence of super-Turing potentialities of evolving neural networks depends on the possibility for “nature” to realize non-recursive patterns of synaptic evolution.
By contrast, in the case of real-weighted interactive neural networks, the translation from the static to the evolving framework does not bring any additional computational power to the networks. In other words, the computational capabilities conferred by the power of the continuum cannot be further increased by incorporating additional possibilities of synaptic evolution into the model.
To summarize, the possibility of synaptic evolution in a basic first-order interactive rate neural model provides an alternative, yet equivalent, route to the consideration of analog synaptic weights towards the achievement of super-Turing computational capabilities of neural networks. Yet even if the concepts of evolution on the one hand and of the analog continuum on the other hand turn out to be mathematically equivalent in this sense, they remain conceptually quite distinct. Indeed, while the power of the continuum is a pure conceptualization of the mind, the synaptic plasticity of networks is actually observable in nature.
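As a loose illustration of this equivalence (this is our own toy sketch, not a construction from the paper), a static real weight r can be mirrored by an evolving sequence of rational weights q_0, q_1, ..., where each q_n is the n-bit binary truncation of r; every weight in the sequence is rational, yet the sequence as a whole carries the same information as the real number:

```python
from fractions import Fraction

def rational_truncation(r, n):
    """n-bit binary truncation of a real weight r in [0, 1)."""
    return Fraction(int(r * 2 ** n), 2 ** n)

# 0.8125 = 0.1101 in binary, standing in for an arbitrary real weight
r = 0.8125
approx = [rational_truncation(r, n) for n in range(5)]

# each q_n is rational, and q_n -> r as n grows, with error at most 2^-n
assert approx[4] == Fraction(13, 16)
assert all(abs(float(q) - r) <= 2 ** -n for n, q in enumerate(approx))
```

The point of the sketch is only conceptual: an evolving sequence of rational synaptic configurations can encode, in the limit, the same information content as a single analog weight.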
The present work is envisioned to be extended in three main directions. Firstly, a deeper study of the issue from the perspective of computational complexity could be of interest. Indeed, the simulation of an I-Ev-RNN[R] N by some I-Ev-RNN[Q] N′ described in the proof of Proposition 1 is clearly not effective, in the sense that for any output move of N, the network N′ first needs to decode the word w_n, of size exponential in n, before being capable of providing the same output as N. In the proof of Proposition 2, the effectivity of the two simulations that are described depends on the complexity of the synaptic configurations N(t) of N as well as on the complexity of the advice function α(n) of M.
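The source of this non-effectivity can be sketched with a toy cost model (ours, not the paper's construction): if the simulating rational network must read a word w_n with |w_n| = 2^n before emitting its n-th output, while the simulated real-weighted network emits each output in a single step, the per-output overhead of the simulation grows exponentially with the step index:

```python
def decoding_cost(n):
    """Symbols read to decode w_n in this toy model, where |w_n| = 2**n."""
    return 2 ** n

# the simulated network pays O(1) per output; the simulator pays 2^n
assert [decoding_cost(n) for n in range(5)] == [1, 2, 4, 8, 16]
```

A complexity-theoretic refinement of the simulation would aim at reducing, or at least precisely characterizing, this overhead.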
Secondly, we expect to consider more realistic neural models capable of capturing biological mechanisms that are significantly involved in the computational and dynamical capabilities of neural networks, as well as in the processing of information in the brain in general. For instance, the consideration of biological features such as spike-timing-dependent plasticity, neural birth and death, apoptosis, and chaotic behaviors of neural networks could be of specific interest.
Thirdly, we envision considering more realistic paradigms of interactive computation, where the processes of interaction would be more elaborate and biologically oriented, involving not only the network and its environment, but also several distinct components of the network as well as different aspects of the environment.
Finally, we believe that the study of the computational power of neural networks from the perspective of theoretical computer science should ultimately contribute to a better understanding of the intrinsic nature of biological intelligence.