Authors: Veïs Oudjail ¹ and Jean Martinet ²
Affiliations: ¹ Univ. Lille, CNRS, Centrale Lille, UMR 9189 – CRIStAL, F-59000, Lille, France; ² Université Côte d’Azur, CNRS, I3S, France
Keyword(s): Motion Analysis, Spiking Neural Networks, Event-based Sensor, Parameter Exploration.
Abstract: Estimating motion features is an essential step in dynamic scene analysis. Optical flow typically quantifies the apparent motion of objects. Motion features can benefit from bio-inspired models of the mammalian retina, where ganglion cells show preferences for global patterns of direction, especially in the four cardinal translatory directions. We study the meta-parameters of a bio-inspired motion estimation model using event cameras, which are bio-inspired vision sensors that naturally capture the dynamics of a scene. The motion estimation model consists of an elementary Spiking Neural Network that learns motion dynamics in an unsupervised way through Spike-Timing-Dependent Plasticity. After short simulation times, the model successfully estimates directions without supervision. Among the advantages of such networks are their unsupervised and continuous learning capabilities, as well as their implementability on very low-power hardware. The model is tuned using a synthetic dataset generated for parameter estimation, made of various patterns moving in several directions. The parameter exploration shows that attention should be given to model tuning, and yet the model is generally stable over meta-parameter changes.