A Combined Calibration of 2D and 3D Sensors
A Novel Calibration for Laser Triangulation Sensors based on Point
Correspondences
Alexander Walch and Christian Eitzinger
Machine Vision Department, Profactor GmbH, Steyr-Gleink, Austria
Keywords:
Laser Triangulation Sensor, Calibration of Laser Stripe Profiler, Colored Point Cloud, Camera Calibration.
Abstract:
In this paper we describe a 2D/3D vision sensor that consists of a laser triangulation sensor and a matrix colour camera. The sensor fuses the 3D data delivered by the laser triangulation sensor with the colour information of the matrix camera into a coloured point cloud. To this end, a novel calibration method for the laser triangulation sensor was developed, which makes it possible to use one common calibration object for both cameras and provides their relative spatial position. A sensor system with a SICK Ranger E55 profile scanner and a DALSA Genie colour camera was set up to evaluate the calibration in terms of how well the colour information matches the 3D point cloud.
1 INTRODUCTION
Laser triangulation sensors are a widely used instrument for acquiring 3D information in machine vision applications. They consist of a matrix camera and a laser stripe projector.
The matrix camera is directed onto the plane defined by the laser and is parameterized such that only data from points intersecting this laser plane are captured.
To capture more than a single profile, laser triangulation sensors are often used to scan objects moved by a conveyor belt, or they are mounted on a linear axis.
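The underlying triangulation principle can be sketched as follows: each illuminated pixel defines a viewing ray through the camera centre, and intersecting this ray with the known laser plane yields a 3D point. The sketch below is only illustrative (the intrinsic matrix and plane parameters are invented values, not those of the sensor described in this paper):

```python
import numpy as np

def triangulate_pixel(pixel, K, plane_n, plane_d):
    """Intersect the viewing ray of an image pixel with the laser plane.

    pixel   : (u, v) image coordinates of the detected laser line
    K       : 3x3 camera intrinsic matrix
    plane_n : normal of the laser plane in camera coordinates
    plane_d : plane offset, so that points X satisfy plane_n @ X = plane_d
    """
    # Back-project the pixel to a viewing ray through the camera centre.
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    # Scale the ray so that it hits the laser plane: plane_n @ (t * ray) = plane_d.
    t = plane_d / (plane_n @ ray)
    return t * ray

# Illustrative values: focal length 500 px, principal point (320, 240),
# laser plane facing the camera at distance 100 (units arbitrary).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
point = triangulate_pixel((320.0, 240.0), K, np.array([0.0, 0.0, 1.0]), 100.0)
print(point)  # the principal ray hits the plane at (0, 0, 100)
```

Repeating this intersection for every pixel of the detected laser line yields one 3D profile; the scan motion then stacks these profiles into a point cloud.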
However, for many inspection tasks it is necessary to obtain information about the objects' texture in addition to their 3D shape.
On the one hand, such a system is presented in (Munaro et al., 2011), where a single colour camera is used for both tasks and two lasers are additionally employed to reduce occlusion: a region in the middle of the camera's image sensor provides the colour information, while the surrounding regions, together with the two lasers, serve as a laser triangulation sensor.
On the other hand, several methods already exist for calibrating pure laser triangulation sensors. Some of them, such as the calibration method provided by the manufacturer of the Ranger E55, only describe a mapping between the image plane and the laser plane of the triangulation sensor. These methods provide no information about the spatial position of the sensor, and in the case of a linearly moved sensor they require the laser to be mounted orthogonally to the direction of motion. In (McIvor, 2002) a calibration is presented which uses a 3D calibration object and whose mathematical model fully describes the laser triangulation sensor, including its extrinsic parameters.
In this paper we describe, in Section 2.3, a calibration which only uses data received from the laser plane and does not use the laser triangulation sensor's camera as a matrix camera, as is done in (Munaro et al., 2011) and (Bolles et al., 1981). Hence it is also applicable to camera setups that use bandpass filters to block out ambient light.
All data necessary for the calibration are obtained from a single scan of the calibration object. This makes the calibration process more efficient, especially since, in addition to the laser triangulation sensor, we also calibrate the second camera, which provides the colour images.
In contrast to the algorithm described in (McIvor, 2002), the distance the objects are moved between two captured profiles does not need to be known; it is instead a parameter determined by the calibration.
Furthermore, the novel calibration is easy to implement, because either the direct linear transformation (DLT) algorithm (Abdel-Aziz and Karara, 1971) is used to determine the camera parameters or closed-form solu-
DOI: 10.5220/0004682900890095
In Proceedings of the 9th International Conference on Computer Vision Theory and Applications (VISAPP 2014), pages 89-95
ISBN: 978-989-758-003-1
Copyright © 2014 SCITEPRESS (Science and Technology Publications, Lda.)