algorithm (1:8 ratio). The noise stems from the spatial partial derivatives, which tend to amplify small differences in the optical flow.
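To make this noise mechanism concrete, the following minimal sketch (a hypothetical illustration, not the paper's code; all names and values are ours) differentiates a smooth synthetic flow field contaminated with small per-pixel noise:

import numpy as np

# Hypothetical illustration: spatial differentiation of an optical-flow
# field amplifies per-pixel noise relative to the (small) true gradients.
rng = np.random.default_rng(42)
h, w = 100, 100
xs = np.arange(w, dtype=float)

# Smooth synthetic horizontal flow: true spatial gradient of 0.02 px/px.
u_clean = np.tile(0.02 * xs, (h, 1))
u_noisy = u_clean + rng.normal(scale=0.1, size=(h, w))  # 0.1 px flow noise

# np.gradient uses central differences in the interior of the grid.
dudx_noisy = np.gradient(u_noisy, axis=1)

snr_flow = np.abs(u_clean).mean() / 0.1            # ~10: flow itself is usable
snr_grad = 0.02 / np.std(dudx_noisy - 0.02)        # ~0.3: gradient is buried
print(f"flow SNR ~ {snr_flow:.1f}, gradient SNR ~ {snr_grad:.2f}")

Because the true spatial gradients of optical flow are typically much smaller than the flow values themselves, even modest flow noise dominates after differentiation, which is consistent with the noisy looming maps discussed above.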
Moving objects, such as the bike and the white car in the image sequence, appear in a darker red color (indicating less threat) because of their small relative velocity with respect to the camera. The estimated looming thus captures the actual threat posed by moving objects, which is a clear advantage of looming over conventional perception of depth.
5 CONCLUSIONS
Visual looming has been shown to be an important cue for navigation and control tasks. This paper explores new ways of deriving and estimating visual looming for general six-degrees-of-freedom motion. The main contributions of the paper are:
1. We derived two novel analytical closed-form expressions for calculating looming under any six-degrees-of-freedom motion. These expressions involve spatial derivatives of the motion field and apply to any relative motion between an observer and any visible point.
2. We showed the theoretical relationship between the surface normal and the values of looming.
3. We presented simulation results on the effect of surface normals on the calculated looming values. Quantitative calculations showed how the angle between the surface normal and the range vector affects the accuracy of the estimated looming values.
4. We demonstrated how to extract looming from optical flow: the output of the RAFT model was used to estimate looming values with the expressions derived in this paper (a simplified sketch follows this list).
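As a hedged illustration of this last step, the sketch below assumes a dense flow field is given (e.g., RAFT's final prediction converted to a NumPy array) and approximates looming by half the divergence of the flow, a classical approximation in the spirit of Cipolla and Blake (1992) rather than the paper's full closed-form expressions:

import numpy as np

def looming_from_flow(flow):
    """flow: (H, W, 2) array of per-pixel (u, v) in pixels/frame."""
    u, v = flow[..., 0], flow[..., 1]
    du_dx = np.gradient(u, axis=1)   # central differences in the interior
    dv_dy = np.gradient(v, axis=0)
    return 0.5 * (du_dx + dv_dy)     # half the image divergence of the flow

# Example on a synthetic purely expanding flow (approach along the
# optical axis toward a frontoparallel surface):
H, W = 64, 64
yy, xx = np.mgrid[0:H, 0:W].astype(float)
k = 0.05                                           # expansion rate, 1/frame
flow = np.stack([k * (xx - W / 2), k * (yy - H / 2)], axis=-1)
L = looming_from_flow(flow)
print(L.mean())                                    # ~0.05, recovers k

On a purely expanding field this recovers the expansion rate, which corresponds to the inverse time-to-contact; the expressions derived in the paper additionally cover general six-degrees-of-freedom motion.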
It should also be emphasized that knowledge of the range to the 3D point and of the camera's translation and rotation (egomotion) is not required for estimating looming, and hence not for the autonomous navigation tasks that rely on it.
ACKNOWLEDGMENTS
The authors thank Dr. Sridhar Kundur for many fruitful discussions and suggestions, as well as very detailed comments and clarifications that led to meaningful improvements of this paper. The authors also thank Nikita Ostro and Benjamin Thaw for their thorough review of this paper.
REFERENCES
Ache, J. M., Polsky, J., Alghailani, S., Parekh, R., Breads, P., Peek, M. Y., Bock, D. D., von Reyn, C. R., and Card, G. M. (2019). Neural basis for looming size and velocity encoding in the Drosophila giant fiber escape pathway. Current Biology, 29(6):1073–1081.
Albus, J. S. and Hong, T. H. (1990). Motion, depth, and image flow. In Proceedings, IEEE International Conference on Robotics and Automation, pages 1161–1170. IEEE.
Aloimonos, Y. (1992). Is visual reconstruction necessary? Obstacle avoidance without passive ranging. Journal of Robotic Systems, 9(6):843–858.
Cipolla, R. and Blake, A. (1992). Surface orientation and time to contact from image divergence and deformation. In European Conference on Computer Vision, pages 187–202. Springer.
Evans, D. A., Stempel, A. V., Vale, R., Ruehle, S., Lefler, Y., and Branco, T. (2018). A synaptic threshold mechanism for computing escape decisions. Nature, 558(7711):590–594.
Fabian, S. T., Sumner, M. E., Wardill, T. J., and Gonzalez-Bellido, P. T. (2022). Avoiding obstacles while intercepting a moving target: a miniature fly's solution. Journal of Experimental Biology, 225(4):jeb243568.
Geiger, A., Lenz, P., Stiller, C., and Urtasun, R. (2013). Vision meets robotics: The KITTI dataset. The International Journal of Robotics Research, 32(11):1231–1237.
Gibson, J. J. (2014). The Ecological Approach to Visual Perception: Classic Edition. Psychology Press.
Horn, B. K. and Schunck, B. G. (1981). Determining optical flow. Artificial Intelligence, 17(1-3):185–203.
Kundur, S. R. and Raviv, D. (1999). Novel active vision-based visual threat cue for autonomous navigation tasks. Computer Vision and Image Understanding, 73(2):169–182.
Lee, D. N. and Reddish, P. E. (1981). Plummeting gannets: A paradigm of ecological optics. Nature, 293(5830):293–294.
Meriam, J. and Kraige, L. (2012). Engineering Mechanics: Dynamics. Wiley.
Muijres, F. T., Elzinga, M. J., Melis, J. M., and Dickinson, M. H. (2014). Flies evade looming targets by executing rapid visually directed banked turns. Science, 344(6180):172–177.
Raviv, D. (1992). A quantitative approach to looming. US Department of Commerce, National Institute of Standards and Technology.
Raviv, D. and Joarder, K. (2000). The visual looming navigation cue: A unified approach. Computer Vision and Image Understanding, 79:331–363.
Ridwan, I. (2018). Looming object detection with event-based cameras. University of Lethbridge (Canada).
Teed, Z. and Deng, J. (2020). RAFT: Recurrent all-pairs field transforms for optical flow. In European Conference on Computer Vision, pages 402–419. Springer.
Verri, A. and Poggio, T. (1986). Motion field and optical flow: Qualitative properties.