Authors: Joubert Damien 1; Konik Hubert 2 and Chausse Frederic 3
Affiliations:
1 DEA-SAR, Groupe Renault, 1 Avenue du Golf, Guyancourt, France
2 Univ Lyon, UJM-Saint-Etienne, CNRS, Télécom Saint-Etienne, Laboratoire Hubert Curien UMR 5516, F-42023, Saint-Etienne, France
3 Université Clermont Auvergne, CNRS, SIGMA Clermont, Institut Pascal, F-63000 Clermont-Ferrand, France
Keyword(s):
Event-based Sensor, Convolutional Neural Network, SSD, Faster-RCNN, Transfer Learning.
Related Ontology Subjects/Areas/Topics: Applications; Computer Vision, Visualization and Computer Graphics; Early and Biologically-Inspired Vision; Image and Video Analysis; Pattern Recognition; Robotics; Software Engineering
Abstract:
Mainly inspired by biological perception systems, event-based sensors provide data with many advantages, such as timing precision, data compression and low energy consumption. This work analyzes how such data can be used to detect and classify cars for front-camera automotive applications. The basic idea is to combine state-of-the-art deep learning algorithms with event-based data integrated into artificial frames. When this preprocessing method is used for visualization, it suggests that the shape of the targets can be extracted, but only when the relative speed between the camera and the targets is high enough. Event-based sensors seem to provide a more robust description of the target's trajectory than conventional frames, since the object is described only by its moving edges, independently of lighting conditions. It is also highlighted how features trained on conventional grey-level images can be transferred to event-based data to efficiently detect cars in pseudo-images.
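The "artificial frames" mentioned in the abstract refer to accumulating asynchronous events into image-like arrays that a standard CNN detector can consume. Below is a minimal sketch of one such accumulation scheme, assuming events are tuples of (x, y, timestamp, polarity); the function name, mid-grey baseline and per-event increment are illustrative assumptions, not the authors' exact preprocessing method.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate (x, y, timestamp, polarity) events into a grey-level
    pseudo-image. Illustrative sketch only: positive-polarity events
    brighten a pixel and negative ones darken it, starting from a
    mid-grey background, so static regions stay grey and moving edges
    stand out as bright/dark strokes."""
    frame = np.full((height, width), 128, dtype=np.int32)  # mid-grey background
    for x, y, _t, p in events:
        frame[y, x] += 32 if p > 0 else -32  # each event nudges its pixel
    return np.clip(frame, 0, 255).astype(np.uint8)

# Example: a vertical moving edge fires positive events along a column.
events = [(10, r, 0.001 * r, +1) for r in range(5, 15)]
frame = events_to_frame(events, height=32, width=32)
```

The resulting uint8 array has the same layout as a single-channel camera frame, which is what allows grey-level-trained detector features (SSD, Faster-RCNN) to be transferred to event data.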