Using an Intelligent Vision System for Obstacle Detection in Winter Condition
Marwa Ziadia¹, Sousso Kelouwani¹, Ali Amamou¹, Yves Dubé¹ and Kodjo Agbossou²
¹Department of Mechanical Engineering, Université du Québec à Trois-Rivières, Canada
²Department of Electrical Engineering, Université du Québec à Trois-Rivières, Canada
Keywords: Vehicle Technology, Collision Avoidance, Mobileye, Advanced Driving Assistance System, Winter Navigation.
Abstract: This paper explores the performance of an Advanced Driving Assistance System (ADAS) during navigation in urban traffic under winter conditions. The selected ADAS technology, Mobileye, has been integrated into a hydrogen electric vehicle. A set of three cameras (visible spectrum) has also been installed to give a surrounding view of the test vehicle. The tests were carried out at dusk as well as at night in winter conditions. Using Matlab, the messages provided by the Mobileye system have been analyzed. More than 2800 samples (short sequences of 5 s of Mobileye messages) have been processed and compared with the corresponding video samples recorded by the three cameras. On average, the selected ADAS device was able to provide 99% true positive vehicle detection and classification, even in poor ambient lighting conditions in winter. However, only 72% of the samples involving a pedestrian were correctly classified.
1 INTRODUCTION
The number of vehicles on the road is continuously growing, and despite the primary safety design of cars, the number of collisions and incidents is also growing (Lyu et al., 2018). The World Health Organization recently reported that more than one million people die every year in road accidents (Curiel-Ramirez et al., 2018).
Traditionally, car manufacturers provide secondary safety features that can reduce the impact of a crash on the passengers; one such feature is airbag technology. However, with the recent progress in vehicle mechatronics, onboard sensors, and soft computing, primary safety features have become a manufacturer priority. These features have the potential to directly mitigate the likelihood of an imminent crash: adaptive cruise control, electronic stability control, etc. (Thompson et al., 2018).
Despite the deployment of these primary safety features, most road accidents are due to driver distraction (Larkin, 2006; Excell, 2005; Wright, 2016; Barclay, 2012; Chang et al., 2009). The Advanced Driving Assistance System (ADAS) has emerged as one of the promising technologies with great potential to help the driver avoid accidents and incidents during distraction periods. Therefore, this paper focuses on an aftermarket ADAS device (Mobileye 5 series), which is based on a monocular camera for collision avoidance and mitigation.
Several studies highlighting the collision-reduction potential of ADAS have recently been published. In (Chang et al., 2010), a vision system combined with GPS sensing was used to enhance obstacle detection and mitigate collision occurrences. This system was shown to be effective in normal operating conditions (daytime, no winter conditions, etc.). Similar operating conditions are reported in (Curiel-Ramirez et al., 2018; Chen et al., 2017; Excell, 2005; Fireman, 2017). In a field test, an ADAS was shown to improve the driver alert level and reduce the likelihood of forward collisions (Thompson et al., 2018). The tests were performed on a fleet of government vehicles, and the detailed driving conditions are not specified. Another trial involving an ADAS was performed in China (urban and highway navigation). All the tests were done during daytime, from 8:00 to 17:30. Although the outcome of the study indicated that the selected ADAS improved the drivers' longitudinal behavior and significantly reduced the likelihood of a forward collision, no winter conditions were used as navigation conditions. Besides, the previously mentioned
study included most of the reported results on the effectiveness and acceptance of ADAS. In (Birrell et al., 2014), an ADAS was tested in real-world conditions to assess whether any measurable beneficial changes could be observed in driving performance. The test scenarios included different road and traffic types. The presented results suggested that a smart driving assistance system can significantly improve driver behavior and can also lead to fuel savings. All these tests were carried out in normal weather operating conditions (no winter conditions). Similar results have also been reported in (Reagan, 2019; Lee et al., 2018).
Regarding the specific problem of detecting pedestrians, several studies have been carried out, and most of them were performed in regular daytime without winter operating conditions. These studies were done on different types of roads, including urban and highway roads. In (Ke et al., 2017), a new framework was introduced that can effectively detect, in real time, a vehicle-pedestrian near-miss through a single monocular camera. The presented results were obtained without considering winter operating conditions. Chen et al. (Chen et al., 2017) studied the interaction of pedestrians and vehicles at unsignalized crossings and were able, through the analysis of more than 2900 crossing events, to build a stochastic interaction model based on a multivariate Gaussian mixture method. Here again, most of the analyzed events did not include winter operating conditions.
One of the most used ADAS devices in the above-mentioned studies is the Mobileye intelligent monocular vision system (Abelson, 2012; Wright, 2016; Wang et al., 2017; Vasic et al., 2016; Markwalter, 2017). Also, most of these studies were carried out without taking the winter navigation condition into account. Cold-climate countries (Canada, Sweden, Finland, the UK, Russia, etc.) can experience harsh winter navigation environments with snow- and ice-covered roads. To be effective, any ADAS should provide collision alert warnings in all weather conditions. Therefore, this paper aims to explore the performance of an ADAS used in urban traffic during winter navigation (dusk as well as night-time). Knowing the real-life limitations of the selected ADAS will allow further optimization to enhance its capabilities in all weather conditions.
The rest of the paper is organized as follows. The test setup and materials are presented in Section 2, the results and discussion are presented in Section 3, and the last section presents the concluding remarks and future directions.
2 MATERIALS AND SETUP
2.1 Materials
A hydrogen vehicle (see Fig. 1) was retrofitted with a Mobileye 5-series intelligent camera. We used this specific type of vehicle because we are investigating the energy efficiency of the fuel cell stack in winter conditions (Amamou et al., 2016; Henao et al., 2012; Cano et al., 2014). The Mobileye system uses a single camera (visible spectrum) installed in the middle of the front windscreen. Using a proprietary processor, it can compute different dynamic parameters related to the vehicle motion (such as the distance between the car and surrounding objects that could potentially be considered obstacles). The system includes a dedicated device that serves as a display (EyeWatch) (see Fig. 2).
Figure 1: The test vehicle: Hyundai Tucson hydrogen vehicle.
The setup is shown in Fig. 3. Mobileye messages are acquired through a Kvaser Leaf device, which is connected to a laptop computer through a USB port. Three GoPro cameras are also connected to the computer.
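As an illustration only, the following Matlab sketch shows one way the logged CAN traffic could be loaded for offline analysis; the file name, the column layout, and the obstacle-message identifier are assumptions made for this example and are not the actual Mobileye protocol definition.
% Minimal sketch, assuming the Kvaser CAN traffic has been exported to a
% plain-text log with one line per frame: <time_s> <decimal ID> <byte0> ... <byte7>.
log  = readmatrix('mobileye_can_log.txt');  % hypothetical file name and layout
t    = log(:, 1);                           % timestamps in seconds
id   = log(:, 2);                           % CAN identifiers
data = uint8(log(:, 3:10));                 % 8-byte payloads
isObstacleMsg  = (id == hex2dec('739'));    % hypothetical obstacle-message identifier
obstacleFrames = data(isObstacleMsg, :);    % payloads to be decoded further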
To analyze the navigation context adequately and to compare the Mobileye messages with the human interpretation of the navigation scene, three monocular GoPro cameras (visible spectrum) were installed. One of them is placed on the front windscreen (see Fig. 4), the second one is installed on the left side of the vehicle, and the last one is placed on the right side. The three cameras provide a surrounding view that helps us interpret each navigation scene. The videos from these cameras are also collected and saved for later analysis.
The GoPro cameras and the Mobileye system are time-synchronized in order to obtain a common time scale for both vision modalities.
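A minimal Matlab sketch of this synchronization step, assuming a single reference event visible in both data streams (all numerical values and variable names below are hypothetical), is given here:
% Align GoPro frame times with the Mobileye/CAN time base using an event
% observed in both streams (e.g., the departure from standstill).
fps          = 30;                        % assumed GoPro frame rate
numFrames    = 150;                       % e.g., a 5 s clip at 30 fps
t_gopro      = (0:numFrames-1)' / fps;    % frame times in seconds (camera clock)
t_event_can  = 12.40;                     % time of the reference event in the CAN log
t_event_cam  = 9.15;                      % time of the same event in the video
offset       = t_event_can - t_event_cam; % constant clock offset between devices
t_gopro_sync = t_gopro + offset;          % frame times expressed on the CAN time base
frameOfTime  = @(t) max(1, round((t - offset) * fps) + 1);  % CAN time -> frame index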
Figure 2: Mobileye Serie 5 with its small display EyeWatch
(Mobileye, 2019).
Figure 3: Setup used during the tests in winter condition.
2.2 Naturalistic Test Scenario
To assess the behavior of the intelligent vision system, several tests were performed during winter 2018 in Canadian urban traffic. The test vehicle followed the road shown in Fig. 5. The weather was cloudy at dusk; in the evening and at night, there was a mixture of rain and snowfall, and the visibility was poor. No extra warnings were given to the driver, and the road users were most likely students (going back home after classes), workers and teachers (going back home at the end of the day), nurses, physicians, and other pedestrians.
Figure 4: GoPro camera installed at the front of the test vehicle (middle of the windshield).
In addition, the driver was asked to ignore the Mobileye messages and to drive the car as usual. All acquired data (from the Mobileye CAN bus as well as from the GoPro cameras) are synchronized to the same time reference for further analysis.
The data were segmented into very short data streams of 5 s. The idea is to sample the whole urban trip into short motions within the framework of urban navigation. Globally, 2880 samples were analyzed. For each sample, the analysis consists of the following steps (a sketch of this per-sample bookkeeping is given after the list):
processing the sample of the Mobileye log file together with its corresponding short video;
identifying the number of detected obstacles in the Mobileye messages stored in the log file;
for each identified obstacle, checking the type of obstacle among vehicle, truck, bike, pedestrian, and bicycle;
playing back the corresponding video record from the GoPro camera and assessing whether the identified obstacle type is really present in the video;
searching the video record for any potential obstacle that, although it appears in the Mobileye field of view, has not been detected and correctly classified.
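The following Matlab fragment only illustrates how such per-sample bookkeeping could be organized; the field names (mobileyeMessages, numObstacles, obstacleType, videoFile) and the toy one-sample input are assumptions for this sketch, not the actual data structures used in the study.
% Illustrative per-sample analysis loop over 5 s segments (assumed field names).
typeNames = {'vehicle', 'truck', 'bike', 'pedestrian', 'bicycle'};   % type codes 0..4
samples = struct('mobileyeMessages', {struct('numObstacles', 1, 'obstacleType', 3)}, ...
                 'videoFile', {'sample_0001.mp4'});   % toy 1-sample example (hypothetical)
results = struct('nObstacles', {}, 'types', {}, 'confirmedByVideo', {});
for k = 1:numel(samples)
    msgs = samples(k).mobileyeMessages;                  % decoded messages of segment k
    results(k).nObstacles = max([msgs.numObstacles 0]);  % obstacles reported within 5 s
    codes = unique([msgs.obstacleType]);                 % reported type codes (0..4)
    results(k).types = typeNames(codes + 1);             % map codes to readable labels
    % Manual step: play back samples(k).videoFile (three GoPro views) and note
    % whether each reported type is visible and whether any obstacle was missed.
    results(k).confirmedByVideo = NaN;                   % filled in by the human annotator
end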
3 RESULTS AND DISCUSSION
Table 1 gives the overall results. Most of the time (about 96% of the 2880 observation samples), the test vehicle shared the road with other road users (cars, trucks, bikes, bicycles, pedestrians, etc.). As the tests were carried out in winter, very few pedestrians were in the streets (see Table 1).
As reported in Table 2, 2702 samples involve another vehicle (without a pedestrian).
Figure 5: Urban test path in a Canadian city in winter. The red color represents the driving route followed by the test vehicle.
Table 1: Global indicators.
Number of samples: 2880
Number of samples with other road users (cars, trucks, pedestrians, etc.): 2763
Number of samples with at least one pedestrian: 61
The vision system was able to correctly detect and classify every vehicle in these video samples, even when the ambient light was very poor. This good performance indicates that the vision system is more robust than what the manufacturer advertises.
Table 2: Vehicle detection results.
Number of samples with at least one vehicle (no pedestrian): 2702
Number of samples with a vehicle correctly detected: 2702
Number of samples with a vehicle incorrectly detected: 0
It has also been observed that 2% of the samples involved a pedestrian. For safety reasons, it is critical to know how the intelligent vision system performs in such difficult weather conditions. Therefore, we investigated what proportion of the samples with a pedestrian was correctly classified as containing a pedestrian.
Table 3 shows the detailed results of pedestrian detection. Among the 61 video samples with at least one pedestrian, 72% were correctly analyzed and classified as containing a pedestrian, while 20% were incorrectly detected. For the remaining 8%, we cannot assess whether the pedestrian was in the Mobileye field of view.
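These percentages follow directly from the counts reported in Table 3; a trivial Matlab check of the rounding is shown below.
% Sanity check of the pedestrian-detection percentages from the Table 3 counts.
nPed     = 61;                                             % samples with a pedestrian
nCorrect = 44;  nWrong = 12;  nUnknown = 5;                % Table 3 counts
fprintf('correctly classified:  %.0f%%\n', 100 * nCorrect / nPed);   % ~72%
fprintf('incorrectly detected:  %.0f%%\n', 100 * nWrong   / nPed);   % ~20%
fprintf('unknown field of view: %.0f%%\n', 100 * nUnknown / nPed);   % ~8%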
Fig. 6 illustrates a video sample in which a pedestrian is correctly detected and classified at dusk in winter. The sky is cloudy and the ambient light is low, which reduces the contrast between the pedestrian and the background.
Table 3: Pedestrian detection results.
Number of samples with at least one pedestrian: 61
Number of pedestrian samples correctly detected: 44
Number of pedestrian samples incorrectly detected: 12
Number of pedestrian samples with unknown classification: 5
This sample was taken when the test vehicle (the Hyundai Tucson hydrogen vehicle), after stopping at a road intersection, started to turn left. At the same time, a pedestrian was crossing the intersection and several other vehicles were waiting at the same intersection. It is important to note that, when the test vehicle was stopped, the pedestrian wore dark clothes and the asphalt color of the road did not offer a good contrast with the background. In addition, as soon as the vehicle started turning, the pedestrian started crossing the road.
The detailed interpretation of the Mobileye messages corresponding to this video sample is shown in Fig. 7.
Figure 6: Snapshot: Good pedestrian detection at dusk in
winter.
Fig. 7 shows two graphs. The time scale of both graphs is the same and corresponds to the ego-time of the video sample (i.e., the time of the first frame of the video sample is 0). From 0 s to 1.2 s, the vehicle is stopped at the intersection, waiting to turn left. Therefore, no important and immediate obstacle is reported by the Mobileye system: the first graph (graph (a)) indicates that the number of obstacles is 0 during that time frame, and consequently no obstacle type is available before 1.2 s, as shown in the second graph of Fig. 7.
Between 1.2 s and 2.4 s, the vehicle is turning left and the Mobileye system has detected one potential obstacle (the y-axis of the first graph shows 1 as the number of detected obstacles). Note that in the video, the other cars are stopped at the road intersection, waiting for their turn to move away. During the same time frame, the second graph of Fig. 7 indicates
that the detected obstacle is likely to be a pedestrian (obstacle type 3). According to the Mobileye technical documentation, the following obstacle types can be reported (a decoding sketch is given after the list):
Type 0 (binary word 000): the detected obstacle is
most likely to be a vehicle
Type 1 (binary word 001): the detected obstacle is
most likely to be a truck
Type 2 (binary word 010): the detected obstacle is
most likely to be a bike
Type 3 (binary word 011): the detected obstacle is
most likely to be a pedestrian
Type 4 (binary word 100): the detected obstacle is
most likely to be a bicycle
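For illustration, a minimal Matlab decoding of this 3-bit type field is sketched below; the byte position of the field within the CAN payload is an assumption made for this example, not the official Mobileye message layout.
% Decode the 3-bit obstacle type code (000..100) into a readable label.
typeNames = {'vehicle', 'truck', 'bike', 'pedestrian', 'bicycle'};  % codes 0..4
payload  = uint8([3 0 0 0 0 0 0 0]);        % example 8-byte CAN payload (hypothetical)
typeCode = bitand(payload(1), uint8(7));    % keep the 3 least significant bits
fprintf('obstacle type %d -> %s\n', typeCode, typeNames{typeCode + 1});  % 3 -> pedestrian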
The pedestrian classification is correct, since we can observe in the snapshot (Fig. 6) that a pedestrian was indeed crossing the road.
Figure 7: Mobileye messages corresponding to the good pedestrian detection at dusk in winter: (a) the number of obstacles reported by the system; (b) the detected obstacle type: 0 = vehicle, 1 = truck, 2 = bike, 3 = pedestrian, 4 = bicycle.
In the snapshot shown in Fig. 8, the pedestrian on the sidewalk has been successfully detected and classified despite the poor contrast between the pedestrian and the background. It is worth mentioning that there was a drop of snow in front of the camera, visible as the white spot in the upper-left corner of Fig. 8.
During the tests, several samples with a pedestrian were not successfully processed. At this time, it is very difficult to identify the most likely reason for these missed detections. In the following, we illustrate some of the failed pedestrian detections.
In Figs. 9 and 10, the test vehicle is in traffic at night in winter.
Figure 8: Snapshot: Good pedestrian detection during the
night in winter.
Figure 9: Snapshot: A vehicle is correctly detected and clas-
sified but the pedestrian is not detected (see Fig. 10).
There is a truck on its left and a passenger car in front. In addition, we can observe a pedestrian facing the road and starting to cross it (Fig. 9). The analysis of the Mobileye messages indicates that only one obstacle is reported (graph (a) of Fig. 10). The type of the reported obstacle is "0", which means that the detected obstacle is most likely a vehicle (graph (b) of Fig. 10). Hence, the pedestrian is not detected.
In the following figures, we show a case of no detection, in which the test vehicle was in urban traffic and several spotlights were in the camera field of view. One car is in front, while two pedestrians are walking on the sidewalk.
The next figures show a sample with a non-detected pedestrian, although the front vehicle is correctly detected and classified. This sample was taken at dusk in winter. Indeed, we can observe in Fig. 13 a pedestrian crossing the road on the left while the test vehicle is waiting to turn left. The analysis of the Mobileye messages in Fig. 14 indicates that the number of reported obstacles is 1 (graph (a)) and that the reported obstacle corresponds to the front vehicle (graph (b)). However, no specific detection is reported for the crossing pedestrian, even though he was walking on the road.
Figure 10: Mobileye signal interpretation: (a) the number of reported obstacles is 1; (b) the obstacle type is a vehicle. Clearly, the pedestrian near the vehicle in front is not detected, as shown in the snapshot (see Fig. 9).
Figure 11: Snapshot: The vehicle in front and the two
pedestrians are not detected (see Fig. 12).
Figure 12: Mobileye signal interpretation: the number of reported obstacles is 0.
4 CONCLUSION AND FUTURE WORKS
In this work, one of the most widely used vision systems for advanced driving assistance (Mobileye) has been tested in winter conditions and urban traffic.
Figure 13: Snapshot: the front vehicle is detected but the crossing pedestrian is not (see Fig. 14).
Figure 14: Mobileye signal interpretation: (a) the number of reported obstacles; (b) the reported obstacle type.
Using additional cameras, the performance of this advanced vision system has been assessed in harsh environmental conditions. The tests started at winter dusk and covered different lighting and weather conditions. In general, the Mobileye vision system provides good detection accuracy for cars and other vehicles (more than 99%), whereas reliable pedestrian detection at dusk and during the night seems to be more difficult to achieve. In future work, an in-depth analysis of which pedestrian characteristics are most likely to have a significant impact on the detection accuracy in winter will be carried out. Besides, we will investigate the combination of a thermal vision system with the Mobileye vision system to increase the pedestrian detection and classification accuracy.
ACKNOWLEDGEMENTS
This work was supported by the Natural Sciences and
Engineering Research Council of Canada.
REFERENCES
Abelson, P. (2012). Playing it safe. Concrete Producer,
30(1).
Amamou, A. A., Kelouwani, S., Boulon, L., and Agbossou,
K. (2016). A comprehensive review of solutions and
strategies for cold start of automotive proton exchange
membrane fuel cells. IEEE Access, 4:4989–5002.
Barclay, S. (2012). Mobileye to launch the world’s first
smartphone-connected driver assistance system. Au-
tomotive Industries AI, 192(1).
Birrell, S. A., Fowkes, M., and Jennings, P. A. (2014). Ef-
fect of using an in-vehicle smart driving aid on real-
world driver performance. IEEE Transactions on In-
telligent Transportation Systems, 15(4):1801–1810.
Cano, M. H., Kelouwani, S., Agbossou, K., and Dubé, Y. (2014). Free air breathing proton exchange membrane fuel cell: Thermal behavior characterization near freezing temperature. Journal of Power Sources, 246:650–658.
Chang, B., Tsai, H., and Young, C.-P. (2010). Intelli-
gent data fusion system for predicting vehicle colli-
sion warning using vision/gps sensing. Expert Systems
with Applications, 37(3):2439–2450.
Chang, B., Young, C.-P., and Tsai, H. (2009). Simu-
lation and implementation of high-performance col-
lision warning system for motor vehicle safety us-
ing embedded anfis prediction. International Jour-
nal of Innovative Computing, Information and Con-
trol, 5(10):3415–3430.
Chen, B., Zhao, D., and Peng, H. (2017). Evaluation
of automated vehicles encountering pedestrians at
unsignalized crossings. pages 1679–1685.
Curiel-Ramirez, L., Ramirez-Mendoza, R., Carrera, G.,
Izquierdo-Reyes, J., and Bustamante-Bello, M.
(2018). Towards of a modular framework for semi-
autonomous driving assistance systems. International
Journal on Interactive Design and Manufacturing,
pages 1–10.
Excell, J. (2005). Sens-ational drive. Engineer,
293(7668):30–31.
Fireman, M. (2017). Mobileye to generate, share, and uti-
lize vision data for crowdsourced mapping with nis-
san. Automotive Industries AI, 197(4).
Henao, N., Kelouwani, S., Agbossou, K., and Dubé, Y. (2012). PEMFC low temperature startup for electric vehicle. In IECON 2012 - 38th Annual Conference on IEEE Industrial Electronics Society, pages 2977–2982. IEEE.
Ke, R., Lutin, J., Spears, J., and Wang, Y. (2017). A cost-
effective framework for automated vehicle-pedestrian
near-miss detection through onboard monocular vi-
sion. volume 2017-July, pages 898–905.
Larkin, J. (2006). Mobileye vision technologies providing
applications for delphi’s collision mitigation system.
Automotive Industries AI, 186(4).
Lee, M., Lim, W., Kim, S., and Sunwoo, M. (2018). Tem-
poral local route modeling using the recognized lane
for autonomous driving comfort.
Lyu, N., Deng, C., Xie, L., Wu, C., and Duan, Z. (2018). A
field operational test in china: Exploring the effect of
an advanced driver assistance system on driving per-
formance and braking behavior. Transportation Re-
search Part F: Traffic Psychology and Behaviour.
Markwalter, B. (2017). The path to driverless cars [cta
insights]. IEEE Consumer Electronics Magazine,
6(2):125–126.
Mobileye (2019). Mobileye serie 5, https://www.mobileye.com/fr-fr/produits/mobileye-serie-5/.
Reagan, I. (2019). Effects of an aftermarket crash avoid-
ance system on warning rates and driver acceptance in
urban and rural environments. Advances in Intelligent
Systems and Computing, 786:776–787.
Thompson, J., Mackenzie, J., Dutschke, J., Baldock, M.,
Raftery, S., and Wall, J. (2018). A trial of retrofitted
advisory collision avoidance technology in govern-
ment fleet vehicles. Accident Analysis and Prevention,
115:34–40.
Vasic, M., Mansolino, D., and Martinoli, A. (2016). A
system implementation and evaluation of a coopera-
tive fusion and tracking algorithm based on a gaus-
sian mixture phd filter. volume 2016-November, pages
4172–4179.
Wang, J.-G., Zhou, L., Song, Z., and Yuan, M. (2017). Real-
time vehicle signal lights recognition with hdr camera.
pages 355–358.
Wright, C. (2016). Mobileye to offer user-generated map-
ping for autonomous driving: New technology will al-
low drivers to provide real-time road information to
enhance autonomous driving experience. volkswagen
and general motors first strategic partners to explore
this technology. Automotive Industries AI, 196(1).