Table 4: Computation times needed for calculations per event.

Algorithm             Computation time [µs]
DS                     77.465 ±  32.540
LK Backward           312.974 ± 170.568
LK Central            384.489 ± 184.946
LK Savitzky-Golay     264.913 ±  94.412
LP Single Fit         173.890 ± 120.973
LP Savitzky-Golay     129.749 ± 112.549
LP Regularized        536.486 ± 173.914
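Per-event timings like those in Table 4 can be gathered with a simple wall-clock harness. The sketch below is illustrative only: `compute_flow_for_event` is a hypothetical stand-in for any of the algorithms above, and the reported mean ± standard deviation will depend on the machine and implementation.

```python
import statistics
import time

def compute_flow_for_event(event):
    # Hypothetical stand-in for one optical-flow update (DS, LK, LP, ...).
    x, y, t, polarity = event
    return (x * 0.1, y * 0.1)  # dummy flow vector

def benchmark(events, repeats=5):
    """Return (mean, stdev) of per-event computation time in microseconds."""
    samples = []
    for _ in range(repeats):
        for ev in events:
            start = time.perf_counter()
            compute_flow_for_event(ev)
            samples.append((time.perf_counter() - start) * 1e6)  # s -> µs
    return statistics.mean(samples), statistics.stdev(samples)

if __name__ == "__main__":
    # Synthetic event stream: (x, y, timestamp, polarity) tuples.
    events = [(i % 128, i // 128, 0.001 * i, 1) for i in range(1000)]
    mean_us, std_us = benchmark(events)
    print(f"{mean_us:.3f} +/- {std_us:.3f} us per event")
```

Averaging over repeated passes, as the table's standard deviations suggest the authors did, smooths out scheduler jitter in the individual measurements.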
integrating this algorithm in a more complex scheme.
The Lucas-Kanade algorithm provides relatively good accuracy but is not the best in terms of computational cost. Applying the Savitzky-Golay filter to either algorithm (Lucas-Kanade or local plane fit) consistently improves accuracy while significantly reducing computation time.
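For illustration, the Savitzky-Golay filter fits a low-order polynomial over a sliding window, which is why it both denoises a signal and keeps derivative estimates cheap. A minimal sketch using SciPy's `savgol_filter` follows; the signal, window length, and polynomial order are illustrative choices, not the parameters used in this paper.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2.0 * np.pi * 3.0 * t)            # underlying signal
noisy = clean + 0.2 * rng.standard_normal(t.size)  # measurement noise

# Fit a 3rd-order polynomial within each 21-sample window.
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

# Compare mean squared error before and after smoothing.
print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
print("smoothed MSE:", np.mean((smoothed - clean) ** 2))
```

The window length trades noise suppression against responsiveness: wider windows smooth more but blur fast changes in the flow estimates.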
7 CONCLUSION
In this paper, we present a methodology to compare state-of-the-art event-based optical flow algorithms and show their performance in the context of robotic applications. The proposed evaluation led us to build an event-based optical flow ground-truth dataset using a VICON motion-capture system. Our study reveals that all the evaluated algorithms require extensive tuning, both of the time interval over which optical flow is computed and of numerous thresholds, to reach their best accuracy. Our future work will therefore focus on adaptive solutions that let these algorithms perform well across varied scenes in robotic applications while improving their overall performance.
REFERENCES
Baker, S., Scharstein, D., Lewis, J., Roth, S., Black, M. J.,
and Szeliski, R. (2011). A database and evaluation
methodology for optical flow. International journal
of computer vision, 92(1):1–31.
Barranco, F., Fermuller, C., Aloimonos, Y., and Delbruck,
T. (2016). A dataset for visual navigation with neuro-
morphic methods. Frontiers in neuroscience, 10:49.
Benosman, R., Clercq, C., Lagorce, X., Ieng, S.-H., and
Bartolozzi, C. (2013). Event-based visual flow. IEEE
transactions on neural networks and learning systems,
25(2):407–417.
Benosman, R., Ieng, S.-H., Clercq, C., Bartolozzi, C.,
and Srinivasan, M. (2012). Asynchronous frameless
event-based optical flow. Neural Networks, 27:32–37.
Censi, A. and Scaramuzza, D. (2014). Low-latency event-
based visual odometry. In 2014 IEEE International
Conference on Robotics and Automation (ICRA),
pages 703–710. IEEE.
Delbrück, T. (2008). Frame-free dynamic digital vision. In
Proceedings of International Symposium on Secure-
Life Electronics, Advanced Electronics for Quality
Life and Society, Univ. of Tokyo, Mar. 6–7, 2008, pages
21–26.
El-Diasty, M. and Pagiatakis, S. (2010). Calibration and
stochastic modelling of inertial navigation sensor er-
rors. Positioning (POS) Journal Information, page 80.
El-Sheimy, N., Hou, H., and Niu, X. (2007). Analysis
and modeling of inertial sensors using allan variance.
IEEE Transactions on instrumentation and measure-
ment, 57(1):140–149.
Furgale, P., Maye, J., Rehder, J., and Schneider, T. (2014).
Kalibr: A unified camera/imu calibration toolbox.
Heeger, D. J. and Jepson, A. D. (1992). Subspace methods
for recovering rigid motion i: Algorithm and imple-
mentation. International Journal of Computer Vision,
7(2):95–117.
Horn, B. K. P. (1986). Closed-form solution of absolute
orientation using unit quaternions.
Lichtsteiner, P., Posch, C., and Delbruck, T. (2008). A
128×128 120 dB 15 µs latency asynchronous tempo-
ral contrast vision sensor. IEEE journal of solid-state
circuits, 43(2):566–576.
Lucas, B. D. and Kanade, T. (1981). An iterative image
registration technique with an application to stereo vi-
sion. In IJCAI.
Menze, M., Heipke, C., and Geiger, A. (2015). Joint 3d esti-
mation of vehicles and scene flow. In ISPRS Workshop
on Image Sequence Analysis (ISA).
Mueggler, E., Forster, C., Baumli, N., Gallego, G., and
Scaramuzza, D. (2015). Lifetime estimation of events
from dynamic vision sensors. In 2015 IEEE in-
ternational conference on Robotics and Automation
(ICRA), pages 4874–4881. IEEE.
Rueckauer, B. and Delbruck, T. (2016). Evaluation of
event-based algorithms for optical flow with ground-
truth from inertial measurement sensor. Frontiers in
neuroscience, 10:176.
Sola, J. (2017). Quaternion kinematics for the error-state
kalman filter.
Wahba, G. (1965). A least squares estimate of satellite atti-
tude. SIAM review, 7(3):409–409.
Data-set for Event-based Optical Flow Evaluation in Robotics Applications