Authors:
Alaa AlZoubi (1) and David Nam (2)

Affiliations:
(1) School of Computing, The University of Buckingham, Buckingham, U.K.; (2) Centre for Electronic Warfare, Information and Cyber, Cranfield University, Defence Academy of the UK, U.K.
Keyword(s):
Vehicle Activity Recognition, Qualitative Trajectory Calculus, Trajectory Texture, Transfer Learning, Deep Convolutional Neural Networks.
Related Ontology Subjects/Areas/Topics:
Computer Vision, Visualization and Computer Graphics; Motion, Tracking and Stereo Vision; Video Surveillance and Event Detection
Abstract:
The automated analysis of interacting objects or vehicles has many uses, including autonomous driving and security surveillance. In this paper we present a novel method for vehicle activity recognition using a Deep Convolutional Neural Network (DCNN). We use Qualitative Trajectory Calculus (QTC) to represent the relative motion between a pair of vehicles and encode their interactions as a trajectory of QTC states. We then use one-hot vectors to map the trajectory into a 2D matrix, which conserves the essential position information of each QTC state in the sequence. Specifically, we project QTC sequences into a two-dimensional image texture; our method then adapts layers trained on the ImageNet dataset and transfers this knowledge to the activity recognition task. We have evaluated our method on two different datasets and shown that it outperforms state-of-the-art methods, achieving an error rate of no more than 1.16%. Our motivation originates from an interest in the automated analysis of vehicle movement for collision avoidance applications, and we present a dataset of vehicle-obstacle interactions collected from simulator-based experiments.
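The one-hot mapping described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes QTC states written as 4-character strings over {-, 0, +} (as in QTC_C, giving 3^4 = 81 possible states), and the function name and matrix layout are our own choices.

```python
import numpy as np
from itertools import product

# Hypothetical sketch: each QTC_C state is a 4-tuple over {-, 0, +},
# so there are 3**4 = 81 distinct states. We assign each a fixed row index.
SYMBOLS = "-0+"
STATE_INDEX = {s: i for i, s in enumerate(product(SYMBOLS, repeat=4))}

def qtc_to_matrix(sequence):
    """Map a QTC state sequence to an 81 x len(sequence) one-hot matrix.

    `sequence` is a list of 4-character strings such as "-0+0". The column
    position preserves each state's order in the trajectory, which is the
    positional information the abstract says the encoding conserves.
    """
    matrix = np.zeros((len(STATE_INDEX), len(sequence)), dtype=np.uint8)
    for col, state in enumerate(sequence):
        matrix[STATE_INDEX[tuple(state)], col] = 1
    return matrix

# A three-state toy trajectory yields an 81 x 3 binary matrix with
# exactly one active row per column.
demo = qtc_to_matrix(["-0+0", "0000", "+-0+"])
```

Such a matrix can then be rendered (or resized) as a grayscale image texture before being fed to the DCNN, though the exact projection used in the paper may differ.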