BarvEye - Bifocal Active Gaze Control for Autonomous Driving

Ernst Dieter Dickmanns

2015

Abstract

With autonomous driving for road vehicles approaching market introduction, the design parameters of the vision systems currently under investigation deserve critical scrutiny. These parameters are chosen for relatively simple applications on smooth surfaces. In this paper, this is contrasted with the more demanding tasks that human drivers will expect autonomous systems to handle in the longer run. Visual ranges of more than 200 m and simultaneous fields of view of at least 100° appear to be minimal requirements; potential viewing angles of more than 200° are desirable at road crossings and traffic circles. As in human vision, regions of high resolution may be kept small if corresponding gaze control is available. Highly dynamic active gaze control would also allow suppression of angular perturbations during braking or driving on rough ground. A 'Bifocal active road vehicle Eye' (BarvEye) is discussed as an efficient compromise for achieving these capabilities. To approach human levels of performance, larger knowledge bases on separate levels for a) image features, b) objects / subjects, and c) situations in application domains have to be developed, together with the capability of learning on all levels.
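The bifocal tradeoff described in the abstract, a wide simultaneous field of view combined with a small high-resolution region steered by gaze control, can be illustrated with a back-of-the-envelope calculation. The sketch below uses the abstract's 200 m range and 100° field-of-view figures; the sensor width, telephoto field of view, and target size are illustrative assumptions, not values from the paper:

```python
import math

def pixels_on_target(target_width_m: float, range_m: float,
                     fov_deg: float, sensor_px: int) -> float:
    """Approximate horizontal pixel count covering a target.

    Uses the small-angle approximation: angular size in radians
    is roughly target width divided by range.
    """
    angular_size_deg = math.degrees(target_width_m / range_m)
    px_per_deg = sensor_px / fov_deg
    return angular_size_deg * px_per_deg

# A 0.5 m wide object (e.g. a pedestrian torso) at 200 m range:
# Wide-angle camera: 100 deg FOV on an assumed 1920-px-wide sensor
wide = pixels_on_target(0.5, 200.0, 100.0, 1920)   # ~2.8 px: too few
# Telephoto camera: assumed 15 deg FOV at the same sensor resolution
tele = pixels_on_target(0.5, 200.0, 15.0, 1920)    # ~18 px: usable
```

The order-of-magnitude gap between the two results is why a single wide-angle sensor cannot cover both requirements, and why a small telephoto region pointed by active gaze control is an attractive compromise.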



Paper Citation


in Harvard Style

Dickmanns E. (2015). BarvEye - Bifocal Active Gaze Control for Autonomous Driving. In Proceedings of the 10th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2015) ISBN 978-989-758-091-8, pages 428-436. DOI: 10.5220/0005258904280436


in Bibtex Style

@conference{visapp15,
author={Ernst Dieter Dickmanns},
title={BarvEye - Bifocal Active Gaze Control for Autonomous Driving},
booktitle={Proceedings of the 10th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2015)},
year={2015},
pages={428-436},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005258904280436},
isbn={978-989-758-091-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 10th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2015)
TI - BarvEye - Bifocal Active Gaze Control for Autonomous Driving
SN - 978-989-758-091-8
AU - Dickmanns E.
PY - 2015
SP - 428
EP - 436
DO - 10.5220/0005258904280436
ER -