REAL-TIME LOCALIZATION OF AN UNMANNED GROUND
VEHICLE USING A 360 DEGREE RANGE SENSOR
Soon-Yong Park and Sung-In Choi
School of Computer Science and Engineering, Kyungpook National University
1370 Sankyuk-dong, Puk-gu, Daegu, 702-701 Korea
Keywords:
Localization, 3D Sensor, Registration, Unmanned vehicle.
Abstract:
A computer vision technique for the localization of an unmanned ground vehicle (UGV) is presented. The pro-
posed technique is based on 3D registration of a sequence of 360 degree range data and a digital surface model
(DSM). 3D registration of a sequence of dense range data requires a large computation time. For real time
localization, we propose projection-based registration and uniform arc length sampling (UALS) techniques.
UALS reduces the number of 3D sample points while maintaining their uniformity over range data in terms
of ground sample distance. The projection-based registration technique reduces the time of 3D correspon-
dence search. Experimental results from two real navigation paths are shown to verify the performance of the
proposed method.
1 INTRODUCTION
Three dimensional (3D) registration is a computer vi-
sion technique to align multi-view range data with re-
spect to a common coordinate system. It has been widely investigated for applications such as 3D model reconstruction and 3D robot vision. Recently, in the robotics community, 3D registration has been applied to the localization of unmanned robots or vehicles from range data acquired by 3D sensors.
A 3D sensor mounted on an unmanned vehicle
captures the 3D shape around the vehicle, which is
a local 3D map represented with respect to the sensor
coordinate system. If a global reference 3D map containing the 3D shape of the navigation environment is available, the vehicle location can be determined by matching the local map against the global map. In practice, an initial position of the vehicle can be coarsely estimated by a GPS or INS sensor, so only this initial estimate needs to be refined to correctly align the local and global 3D maps, as sketched below.
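As an illustration of this initialization step, the following is a minimal sketch, assuming a yaw-dominant ground vehicle and a 4x4 homogeneous pose matrix; the function name and parameters are hypothetical, not the paper's interface.

```python
# Illustrative sketch: building a coarse initial pose from GPS/INS readings
# to seed local-to-global map registration. The yaw-only rotation and the
# homogeneous 4x4 form are assumptions for illustration.
import numpy as np

def initial_pose(x, y, z, yaw):
    """Coarse vehicle pose in the global (DSM) frame from GPS/INS."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0],
                 [s,  c, 0.0],
                 [0.0, 0.0, 1.0]]   # heading about the vertical axis only
    T[:3, 3] = (x, y, z)            # coarse position from GPS/INS
    return T
```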
A common approach to 3D registration is the ICP algorithm (Besl and McKay, 1992). Madhavan et al. (Madhavan et al., 2005) register a sequence of 3D range data in a pair-wise manner to determine a robot pose; a modified ICP algorithm is employed to cope with matching outliers (a minimal sketch of the baseline ICP loop is given below). Triebel et al. (Triebel et al., 2006) introduce multi-level surface maps to classify surface patches into several object categories. Levinson et al. (Levinson et al., 2007) use a digital map of an urban environment; a particle filter is used to match local range data to the map.
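For reference, here is a minimal sketch of the standard point-to-point ICP loop from (Besl and McKay, 1992), not the modified variants cited above; it assumes numpy and scipy are available and that both clouds are (N, 3) arrays.

```python
# Minimal point-to-point ICP sketch; an illustrative baseline, not the
# authors' implementation. Refines an initial pose (R, t) mapping the
# local cloud into the global frame.
import numpy as np
from scipy.spatial import cKDTree

def icp(local_pts, global_pts, R_init, t_init, iters=20):
    R, t = R_init, t_init
    tree = cKDTree(global_pts)          # nearest-neighbour search structure
    for _ in range(iters):
        moved = local_pts @ R.T + t     # apply current pose estimate
        _, idx = tree.query(moved)      # closest-point correspondences
        matched = global_pts[idx]
        # Closed-form rigid alignment of the matched pairs via SVD.
        mu_p, mu_q = moved.mean(0), matched.mean(0)
        H = (moved - mu_p).T @ (matched - mu_q)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflection
        dR = Vt.T @ D @ U.T
        R, t = dR @ R, dR @ (t - mu_p) + mu_q  # compose incremental update
    return R, t
```

The closest-point query against a tree over the full global cloud is the dominant cost here, which motivates the correspondence-reduction techniques proposed in this paper.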
In a few investigations, 360 degree range sensors are used to capture omnidirectional range data. Himmelsbach et al. (Himmelsbach et al., 2008) segment, classify, and track 3D objects using a 360 degree laser sensor; they generate occupancy grids from the range data to identify obstacles. Kümmerle et al. (Kummerle et al., 2009) present an autonomous driving technique for an unmanned vehicle equipped with multiple navigation sensors, including a 360 degree range sensor. In our previous work, a 3D registration technique was introduced to align 360 degree range data with a digital surface model (DSM) (Park and Choi, 2009).
Matching 3D maps obtained from different coordinate systems requires a reasonable number of correspondences between the maps (Hartley and Zisserman, 2000). Since a 360 degree range sensor captures a huge number of 3D points, 3D registration requires significant computation time. For real-time localization of an unmanned ground vehicle (UGV), we introduce projection-based registration and uniform arc length sampling (UALS) techniques to reduce the number of correspondences and the cost of finding them; a sketch of the projection-based lookup follows this paragraph. Experimental results show that the proposed method can estimate the vehicle position at a rate of about 15 Hz.
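To illustrate the idea behind projection-based correspondence search, the following is a hedged sketch: instead of a nearest-neighbour search over the whole cloud, a transformed 3D point is projected into the sensor's spherical grid and matched to the point stored in that cell. The grid layout and parameters (az_res, el_res, el_min) are assumptions for illustration, not the paper's actual sensor model.

```python
# Hedged sketch of projection-style correspondence lookup. range_image is
# assumed to be an organized (rows, cols, 3) array of 3D points from the
# 360 degree sensor; cell geometry is hypothetical.
import numpy as np

def project_correspondence(p, range_image, az_res, el_res, el_min):
    """Return the stored 3D point in the grid cell that point p projects into."""
    az = np.arctan2(p[1], p[0]) % (2.0 * np.pi)   # azimuth in [0, 2*pi)
    el = np.arctan2(p[2], np.hypot(p[0], p[1]))   # elevation angle
    col = int(az / az_res) % range_image.shape[1]
    row = int((el - el_min) / el_res)
    if 0 <= row < range_image.shape[0]:
        return range_image[row, col]              # candidate correspondence
    return None                                   # outside the sensor's field
```

Each lookup is constant time, replacing the logarithmic (or worse) cost of tree-based nearest-neighbour search, which is the source of the real-time speed-up the paper targets.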