To be used as an authentic trace, optical flow values must be protected against modification from the moment of their creation and must be bound to time values so that there is a record of when events occurred. To do this, we propose to timestamp the optical flow matrices saved in the blackbox.
This paper is organized as follows: in Section 2, we present a literature review of mobile object detection techniques. In Section 3, we present optical flow basics and timestamping in vehicular networks. In Section 4, we describe our approach for optical flow timestamping. We illustrate our proposal with a practical use case in Section 5 and conclude the paper in Section 6.
2 LITERATURE REVIEW
There are three main methods for tracking mobile objects: frame difference, background subtraction and optical flow. The frame difference method is based on the difference between pixels to find the moving object: one frame is taken as a reference, the difference between the reference frame and the current frame is computed, and the moving object is detected from the resulting difference. In (Singla, 2014), the video is captured by a static camera; the difference between two consecutive frames is calculated, converted to a grayscale image, filtered with a Gaussian low-pass filter and finally binarized. Experimental results show that the motion appears in the binary image obtained from the difference between the two frames. In (Joshi, 2014), an algorithm is developed to calculate the speed of moving vehicles and detect those which violate the speed limits. The video is captured by a static camera; tracking of the moving object and calculation of its velocity start with a segmentation step that separates regions of the image, and this segmentation is performed with a frame difference algorithm.
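As an illustration of this family of methods, the sketch below shows a minimal frame-difference pipeline in Python with OpenCV; the kernel size and the binarization threshold are illustrative assumptions and are not taken from the cited works.

```python
# Minimal frame-difference sketch (Python + OpenCV); parameter values are
# illustrative assumptions, not those of the cited works.
import cv2

def frame_difference_mask(prev_frame, curr_frame, thresh=25):
    """Return a binary mask of pixels that changed between two frames."""
    # Work on grayscale images to obtain a single-channel difference.
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    # Absolute per-pixel difference between the two frames.
    diff = cv2.absdiff(prev_gray, curr_gray)
    # Gaussian low-pass filter to suppress sensor noise.
    diff = cv2.GaussianBlur(diff, (5, 5), 0)
    # Binarize: pixels whose change exceeds the threshold are marked as moving.
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return mask
```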
The drawback of the temporal differencing method is that it cannot detect slow changes; for this reason, the background subtraction method is used. This method is based on extracting the background, i.e. the region of the image without motion. The absolute difference between the background model and each incoming frame is used to detect the moving object. (N.Arun Prasath, 2015) uses background subtraction as a first step to estimate vehicle speed: after conversion of the video into frames, the background is extracted to detect moving vehicles, then feature extraction and vehicle tracking are performed to determine the speed. An algorithm that detects moving objects in highly secured environments is proposed in (Singh et al., 2014). Detection is done online and offline. In offline detection, the video is divided into frames and the moving object is detected by separating the foreground from a static background. Once the moving object is identified, it is marked with a rectangular box, and an alarm is activated when the object moves. The position of the object's centroid is calculated, and the distance traveled and the velocity of the object are derived from it. A system for monitoring traffic rule violations is presented in (Gupta, 2015) to monitor speed limit violations and detect the registered license plate number. To monitor speed limit violations, the velocity is calculated. The first step of the velocity calculation is vehicle tracking, which is done by background subtraction: each frame is subtracted from the background model, and the blobs resulting from the subtraction correspond to moving objects.
Background subtraction is a widely used approach for detecting moving objects from static cameras.
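The sketch below illustrates this kind of background-subtraction pipeline using OpenCV's MOG2 background model; the input file name, the model parameters and the blob-area filter are assumptions made for the example, not values from the cited systems.

```python
# Background-subtraction sketch (Python + OpenCV 4); the video name and all
# parameter values are illustrative assumptions.
import cv2

cap = cv2.VideoCapture("traffic.mp4")  # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask: pixels that deviate from the learned background model.
    fg_mask = subtractor.apply(frame)
    # Blobs in the mask correspond to candidate moving objects.
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:  # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()
```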
The optical flow technique is also used for tracking moving objects. It provides the apparent change of a moving object's location between two frames and separates the moving objects from the static background objects. Optical flow is represented by a two-dimensional vector assigned to each pixel of the image, describing the velocity of each point of the image sequence.
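Such a dense flow field can be computed, for instance, with the Farneback method available in OpenCV; the sketch below is a minimal illustration whose parameter values are common defaults chosen here as assumptions.

```python
# Dense optical flow sketch (Farneback method in OpenCV); parameter values
# are common defaults used here as assumptions.
import cv2

def dense_flow(prev_frame, curr_frame):
    """Return a (H, W, 2) array holding one 2-D motion vector per pixel."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Magnitude and direction of the apparent motion at every pixel.
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return flow, mag, ang
```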
Optical flow has been used in many works for tracking mobile objects. (Garcia-Dopico et al., 2014) presents a system for the search and detection of moving objects in a sequence of images captured by a camera installed in a vehicle. The proposed system is based on optical flow analysis to detect and identify moving objects as perceived by a driver. The method consists of three stages. In the first stage, the optical flow is calculated for each image of the sequence as a first estimate of the apparent motion. In the second stage, two segmentation processes are carried out, one on the optical flow itself and one on the images of the sequence. In the last stage, the results of these two segmentation processes are combined to obtain the movement of the objects present in the sequence, identifying their direction and magnitude.
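As a simplified illustration of how a flow field can be segmented into moving regions with a direction and a magnitude, the sketch below thresholds the flow magnitude and reports the mean displacement of each blob; it conveys the general idea only and is not the actual pipeline of (Garcia-Dopico et al., 2014).

```python
# Segment a dense flow field by thresholding its magnitude; the thresholds are
# illustrative assumptions, and this is not the cited authors' pipeline.
import cv2
import numpy as np

def segment_flow(flow, min_magnitude=1.0, min_area=200):
    """Return (bounding box, mean dx, mean dy) for each moving region."""
    mag = np.linalg.norm(flow, axis=2)
    # Pixels moving faster than the threshold form the motion mask.
    mask = (mag > min_magnitude).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    regions = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        x, y, w, h = cv2.boundingRect(c)
        patch = flow[y:y + h, x:x + w]
        regions.append(((x, y, w, h), patch[..., 0].mean(), patch[..., 1].mean()))
    return regions
```

This sketch can be applied directly to the flow array returned by the previous one.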
(Indu et al., 2011) proposes a method to estimate vehicle speed from video sequences acquired with a fixed mounted camera. The vehicle motion is detected and tracked across the frames using an optical flow algorithm. The distance traveled by the vehicle is computed from the movement of its centroid over the frames, and the speed of the vehicle is then estimated from this distance.
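The sketch below shows how such a speed estimate can be obtained from centroid positions, under the assumptions of a fixed camera, a known frame rate and a known pixel-to-metre scale; the numeric values are illustrative and are not taken from (Indu et al., 2011).

```python
# Speed estimation from centroid displacement; the frame rate and the
# pixel-to-metre scale are assumed known, and the values below are illustrative.
import math

def estimate_speed(centroids, fps=30.0, metres_per_pixel=0.05):
    """Estimate the average speed (km/h) from per-frame (x, y) centroids."""
    if len(centroids) < 2:
        return 0.0
    # Accumulate the pixel distance traveled by the centroid frame by frame.
    distance_px = sum(math.hypot(x1 - x0, y1 - y0)
                      for (x0, y0), (x1, y1) in zip(centroids, centroids[1:]))
    elapsed_s = (len(centroids) - 1) / fps
    speed_mps = (distance_px * metres_per_pixel) / elapsed_s
    return speed_mps * 3.6  # convert m/s to km/h
```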
In many works, optical flow was combined with other techniques. In (Guo-Wu Yuan, 2014), optical flow was combined with the frame difference method. In this work, the optical flow for Harris corners is calculated, and