Table 1: Average Precision Results on Car Class.

Method                 AP (%)
Stixel + CNN           60.76
Stixel(SNV) + CNN      62.94
Ground Truth + CNN     91.14
SSD                    65.20
The distance error rate was evaluated on objects more than 5 m away, and the accuracy was 92.51% (Table 2). Because we limit the length of the epipolar line when searching for corresponding points, distance estimates for objects closer than 5 m are not reliable.
Table 2: Distance estimation error rate.

Constraint    Error rate (%)
> 5 m         7.49
> 0 m         8.33
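For reference, the sketch below shows the standard stereo relation behind this limitation; the focal length, baseline, and disparity search range are illustrative values, not parameters of our system.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    # Standard pinhole-stereo relation: Z = f * B / d.
    return focal_px * baseline_m / disparity_px

# Limiting the epipolar search to max_disp pixels caps the largest observable
# disparity, which in turn sets a minimum distance below which no valid
# correspondence can be found.
focal_px, baseline_m, max_disp = 700.0, 0.5, 64   # illustrative values only
min_reliable_z = depth_from_disparity(max_disp, focal_px, baseline_m)
print(f"Minimum measurable distance: {min_reliable_z:.2f} m")  # ~5.5 m with these values
```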
Processing time was also evaluated on several platforms to validate the real-time capability of the system. We averaged the processing time over five runs. The pipeline runs at 25 fps on the PC and at 11 fps on the NVIDIA TX1 board, as shown in Table 3.
Table 3: Processing Time.

Platform                  Module          Time (ms)   Total (ms)
PC (Titan X / i5 4670)    ROI generator   23          39
                          Classifier      16
NVIDIA TX1                ROI generator   51          90
                          Classifier      39
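As a quick sanity check, the per-module times in Table 3 sum to the totals above and convert to the quoted frame rates as follows (a trivial computation, shown only for clarity):

```python
def fps(total_ms):
    # Frame rate implied by the total per-frame processing time.
    return 1000.0 / total_ms

print(f"PC:  {fps(23 + 16):.1f} fps")   # 39 ms total -> ~25.6 fps
print(f"TX1: {fps(51 + 39):.1f} fps")   # 90 ms total -> ~11.1 fps
```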
4 CONCLUSIONS
We introduced an integrated vehicle detection and distance estimation algorithm for real-time AEB systems. Our main contribution is to share disparity map generation, the most time-consuming step, between object detection and distance estimation. To reduce processing time, we use local matching, which is fast but not very reliable, and we alleviate this problem with SNV. The processing time satisfies real-time requirements on the PC and comes close to real time on an embedded board, the TX1. The detection performance is reasonable compared with that of other real-time detection modules.
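As a rough illustration of this shared-disparity design, the sketch below computes the disparity map once with fast local block matching and reuses it for both ROI generation and distance estimation. OpenCV's StereoBM stands in for our local matcher, and roi_generator / estimate_distances are hypothetical placeholders for the stixel and distance modules.

```python
import cv2
import numpy as np

def compute_disparity(left_gray, right_gray, num_disparities=64, block_size=15):
    # Fast local (block) matching; noisy, which is why SNV is applied downstream.
    matcher = cv2.StereoBM_create(numDisparities=num_disparities, blockSize=block_size)
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

# The disparity map is computed once per frame and shared by both stages:
# disparity = compute_disparity(left_gray, right_gray)
# rois      = roi_generator(disparity)             # stixel-based ROI generation
# distances = estimate_distances(disparity, rois)  # distance estimation reuses the same map
```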
The experimental results suggest that the proposed classifier does not fully exploit its classification capability, leaving room for improvement. Future work will therefore include an improved stixel clustering method so that the CNN classifier can be used to its full potential. The CNN classifier itself could also be redesigned for better performance: our model uses very basic convolutional layers, which could be replaced with a state-of-the-art architecture (Szegedy et al., 2016). Finally, we assumed a flat ground estimated from a straight line in the v-disparity. In the real world the ground is not always flat, so fitting a curve in the v-disparity would locate the ground more accurately.
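As a rough illustration of this point, the sketch below builds a v-disparity profile and fits it with a polynomial; degree=1 corresponds to the flat-ground straight-line assumption used here, while degree=2 would allow a curved ground. The histogram details and the use of np.polyfit are illustrative assumptions, not our implementation.

```python
import numpy as np

def fit_ground_in_v_disparity(disparity, max_disp=64, degree=1):
    # For each image row v, take the dominant disparity (the v-disparity peak);
    # the ground typically produces a strong, roughly linear response.
    rows, dominant = [], []
    for v in range(disparity.shape[0]):
        d = disparity[v][disparity[v] > 0].astype(int)
        if d.size == 0:
            continue
        hist = np.bincount(np.clip(d, 0, max_disp - 1), minlength=max_disp)
        rows.append(v)
        dominant.append(hist.argmax())
    # degree=1: straight line (flat ground); degree=2: curved ground profile.
    return np.polyfit(rows, dominant, degree)
```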
ACKNOWLEDGEMENTS
This work was supported by the Industrial
Technology Innovation Program, “10052982,
Development of multi-angle front camera system for
intersection AEB,” funded by the Ministry of Trade,
Industry, & Energy (MI, Korea).
REFERENCES
I. Cabani, G. Toulminet, A. Bensrhair. (2005). Color-based Detection of Vehicle Lights. Proceedings of the IEEE Intelligent Vehicles Symposium.
P. Chang, D. Hirvonen, T. Camus, B. Southall. (2005). Stereo-based Object Detection, Classification, and Quantitative Evaluation with Automotive Applications. Workshops in the IEEE Conference on Computer Vision and Pattern Recognition.
U. Franke, C. Rabe, H. Badino, S. Gehrig. (2005). 6D-Vision: Fusion of Stereo and Motion for Robust Environment Perception. Joint Pattern Recognition Symposium.
A. Elfes. (1989). Using Occupancy Grids for Mobile Robot Perception and Navigation. Computer.
H. Badino, U. Franke, D. Pfeiffer. (2009). The Stixel World - A Compact Medium Level Representation of the 3D-World. Joint Pattern Recognition Symposium.
A. Elfes. (2013). Occupancy Grids: A Stochastic Spatial Representation for Active Robot Perception. arXiv preprint arXiv:1304.1098.
R. Qin, J. Gong, H. Li, X. Huang. (2013). A Coarse Elevation Map-based Registration Method for Super-Resolution of Three-Line Scanner Images. Photogrammetric Engineering & Remote Sensing.
B. Barrois, C. Wohler. (2013). 3D Pose Estimation of Vehicles using Stereo Camera. Transportation Technologies for Sustainability.
A. Barth, U. Franke. (2009). Estimating the Driving State of Oncoming Vehicles from a Moving Platform using Stereo Vision. IEEE Transactions on Intelligent Transportation Systems.
A. Broggi, A. Cappalunga, C. Caraffi, S. Cattani, S. Ghidoni, P. Grisleri, P. Porta, M. Posterli, P. Zani. (2010). TerraMax Vision at Urban Challenge. IEEE Transactions on Intelligent Transportation Systems.