Authors:
Federico A. Galatolo, Mario G. C. A. Cimino and Gigliola Vaglini
Affiliation:
Department of Information Engineering, University of Pisa, 56122 Pisa, Italy
Keyword(s):
Artificial Neural Networks, Recurrent Neural Network, Stigmergy, Deep Learning, Supervised Learning.
Abstract:
In this paper, a novel architecture of Recurrent Neural Network (RNN) is designed and experimentally evaluated. The proposed RNN adopts a computational memory based on the concept of stigmergy. The basic principle of a Stigmergic Memory (SM) is that the activity of depositing/removing a quantity in the SM stimulates the subsequent deposit/removal activities. Accordingly, successive SM activities tend to reinforce or weaken each other, generating a coherent coordination between the SM activities and the temporal input stimulus. We show that, in a supervised classification problem, the SM encodes the temporal input in an emergent representational model by coordinating the deposit, removal and classification activities. This study lays down a basic framework for the derivation of an SM-RNN. A formal ontology of the SM is discussed, and the SM-RNN architecture is detailed. To assess the computational power of an SM-RNN, comparative NNs have been selected and trained to solve the MNIST handwritten digit recognition benchmark in its two variants: spatial (sequences of bitmap rows) and temporal (sequences of pen strokes).
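The stigmergic mechanism sketched in the abstract can be illustrated with a minimal, hypothetical PyTorch-style cell (not the authors' implementation): the deposit and removal amounts are computed from both the current input and the current memory level, so that each deposit/removal activity stimulates the next ones. Layer sizes, gate functions and the non-negativity clamp below are illustrative assumptions.

```python
# Illustrative sketch of a stigmergic memory cell (assumed design, not the paper's code).
import torch
import torch.nn as nn

class StigmergicMemoryCell(nn.Module):
    def __init__(self, input_size, memory_size):
        super().__init__()
        # Deposit and removal gates read both the input and the current memory level,
        # so past activities reinforce or weaken subsequent ones.
        self.deposit = nn.Linear(input_size + memory_size, memory_size)
        self.removal = nn.Linear(input_size + memory_size, memory_size)

    def forward(self, x, mark):
        # mark: the quantity currently stored in the stigmergic memory
        joint = torch.cat([x, mark], dim=-1)
        deposit = torch.sigmoid(self.deposit(joint))   # how much to deposit
        removal = torch.sigmoid(self.removal(joint))   # how much to remove
        # the memory level is kept non-negative (an assumption for this sketch)
        return torch.clamp(mark + deposit - removal, min=0.0)

# Example: processing a temporal input, e.g. an MNIST bitmap read as 28 rows.
cell = StigmergicMemoryCell(input_size=28, memory_size=64)
mark = torch.zeros(1, 64)
sequence = torch.rand(28, 1, 28)      # 28 rows of a 28x28 bitmap, batch size 1
for row in sequence:
    mark = cell(row, mark)            # the final mark encodes the whole sequence
```

In such a scheme, the final memory level would be passed to a classifier head, so that deposit, removal and classification are trained jointly, as the abstract describes.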