or moving cameras, a further adaptation to change in
contrast and luminance would be required.
Figure 9: Example of input and outputs of the EVD architecture. Top left: input frame; top right: final output; bottom left: output at original resolution; bottom middle left: output at original resolution/2; bottom middle right: output at original resolution/4; bottom right: output at original resolution/8.
Figure 10: Example of input and outputs of the EVD architecture. Top left: input frame; top right: final output; bottom left: output at original resolution; bottom middle left: output at original resolution/2; bottom middle right: output at original resolution/4; bottom right: output at original resolution/8.
Figure 11: Example of input and outputs of the EVD architecture. Top left: input frame; top right: final output; bottom left: output at original resolution; bottom middle left: output at original resolution/2; bottom middle right: output at original resolution/4; bottom right: output at original resolution/8.
VISAPP 2010 - International Conference on Computer Vision Theory and Applications