Authors:
Tobias Bolten¹, Christian Neumann¹, Regina Pohle-Fröhlich¹ and Klaus Tönnies²
Affiliations:
¹ Institute for Pattern Recognition, Hochschule Niederrhein, Krefeld, Germany; ² Department of Simulation and Graphics, University of Magdeburg, Germany
Keyword(s):
Dynamic Vision Sensor, Event Data, Instance Segmentation, Multi-Person Tracking, Dataset.
Abstract:
Compared to well-studied frame-based imagers, event-based cameras form a new paradigm. They are biologically inspired optical sensors that differ in both operation and output. While a conventional frame is dense and ordered, the output of an event camera is a sparse and unordered stream of events. Therefore, to take full advantage of these sensors, new datasets are needed for research and development. Despite their growing use, the selection and availability of event-based datasets is currently still limited. To address this limitation, we present a technical recording setup as well as a software processing pipeline for generating event-based recordings in the context of multi-person tracking. Our approach enables the automatic generation of highly accurate instance labels for each individual output event using color features in the scene. Additionally, we employed our method to release a dataset featuring one to four persons and addressing common challenges that arise in multi-person tracking scenarios. This dataset contains nine different scenarios, with a total duration of over 85 minutes.
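As a minimal illustration of the data model described above, the following Python sketch shows the standard per-event tuple produced by a dynamic vision sensor (pixel coordinates, timestamp, polarity) paired with a per-event instance label as the abstract describes. The `Event` class and the integer label convention (0 for background, positive ids for persons) are illustrative assumptions, not the dataset's actual schema.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column on the sensor
    y: int         # pixel row on the sensor
    t: int         # timestamp, e.g. in microseconds
    polarity: int  # +1 brightness increase, -1 decrease

# Hypothetical instance-labeled event stream: each event carries an
# instance id (0 = background, 1..N = tracked persons).
labeled_stream = [
    (Event(x=120, y=64, t=1_000, polarity=+1), 2),
    (Event(x=121, y=64, t=1_042, polarity=-1), 2),
    (Event(x=300, y=10, t=1_050, polarity=+1), 0),
]

# Events are sparse and asynchronous: only pixels observing a brightness
# change emit an event, so ordering by timestamp is the only structure.
person_events = [e for e, label in labeled_stream if label > 0]
```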