Authors: Luma Issa Abdul-Kreem 1 and Heiko Neumann 2

Affiliations: 1 Ulm University and University of Technology, Germany; 2 Ulm University, Germany

Keyword(s): Event-based Vision, Optic Flow, Neuromorphic Sensor, Neural Model.

Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics; Motion, Tracking and Stereo Vision; Optical Flow and Motion Analyses

Abstract: In this paper, we propose a new bio-inspired approach for motion estimation using a Dynamic Vision Sensor (DVS) (Lichtsteiner et al., 2008), in which an event-based temporal window accumulation is introduced. This format accumulates the activity of the pixels over a short time, i.e. several μs. The optic flow is estimated by a new neural model mechanism that is inspired by the motion pathway of the visual system and is consistent with the vision sensor's functionality, for which new temporal filters are proposed. Since the DVS already generates temporal derivatives of the input signal, we suggest a smoothing temporal filter instead of the biphasic temporal filters introduced by (Adelson and Bergen, 1985). Our model extracts motion information via a spatiotemporal energy mechanism that is oriented in the space-time domain and tuned in spatial frequency. To balance the activities of individual cells against the neighborhood activities, a normalization process is carried out. We tested our model using different kinds of stimuli that were moved via translatory and rotatory motions. The results highlight an accurate flow estimation compared with synthetic ground truth. To show the robustness of our model, we probed it with synthetically generated ground-truth stimuli and realistic complex motions, e.g. biological motions and a bouncing ball, with satisfactory results.
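The event-based temporal window accumulation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes a DVS event stream of `(x, y, t_us, polarity)` tuples and simply sums signed events that fall inside a short time window into a single 2D frame, which downstream spatiotemporal filters would then process.

```python
import numpy as np

def accumulate_events(events, t_start, window_us, shape=(128, 128)):
    """Accumulate DVS events (x, y, t_us, polarity) that fall inside a
    short temporal window [t_start, t_start + window_us) into one frame.
    ON events (polarity > 0) add +1, OFF events add -1.
    Hypothetical helper for illustration only."""
    frame = np.zeros(shape, dtype=np.float32)
    for x, y, t, p in events:
        if t_start <= t < t_start + window_us:
            frame[y, x] += 1.0 if p > 0 else -1.0
    return frame

# Toy event stream: two ON events at (10, 20), one OFF event at (30, 40),
# and one event arriving after the 100 us window closes.
events = [(10, 20, 5, 1), (10, 20, 50, 1), (30, 40, 60, -1), (5, 5, 200, 1)]
frame = accumulate_events(events, t_start=0, window_us=100)
print(frame[20, 10])  # 2.0  (two ON events accumulated)
print(frame[40, 30])  # -1.0 (one OFF event)
print(frame[5, 5])    # 0.0  (event outside the window is ignored)
```

The window length (here 100 μs) is a free parameter; the paper accumulates over "several μs", trading temporal resolution against event density per frame.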

CC BY-NC-ND 4.0


Paper citation in several formats:
Abdul-Kreem, L. and Neumann, H. (2015). Bio-inspired Model for Motion Estimation using an Address-event Representation. In Proceedings of the 10th International Conference on Computer Vision Theory and Applications (VISIGRAPP 2015) - Volume 1: VISAPP; ISBN 978-989-758-091-8; ISSN 2184-4321, SciTePress, pages 335-346. DOI: 10.5220/0005311503350346

@conference{visapp15,
author={Luma Issa Abdul{-}Kreem and Heiko Neumann},
title={Bio-inspired Model for Motion Estimation using an Address-event Representation},
booktitle={Proceedings of the 10th International Conference on Computer Vision Theory and Applications (VISIGRAPP 2015) - Volume 1: VISAPP},
year={2015},
pages={335-346},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005311503350346},
isbn={978-989-758-091-8},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 10th International Conference on Computer Vision Theory and Applications (VISIGRAPP 2015) - Volume 1: VISAPP
TI - Bio-inspired Model for Motion Estimation using an Address-event Representation
SN - 978-989-758-091-8
IS - 2184-4321
AU - Abdul-Kreem, L.
AU - Neumann, H.
PY - 2015
SP - 335
EP - 346
DO - 10.5220/0005311503350346
PB - SciTePress