marker and a calibration pattern in robot visual
servoing applications.
In the static 2D test, we moved the IR-LED array with the linear
drive perpendicularly to the camera's optical axis and measured the
increments in the image. The purpose was to detect the smallest
movement still producing a linear response in the image. The IR-LED
centroids were determined in two ways: on binary images, and on
grey-level images as centers of mass. During image grabbing the
array did not move, thus eliminating any dynamic effects. We
averaged the movement of the centroids of 10 IR-LEDs over a
sequence of 16 images and calculated the standard deviation to
obtain accuracy confidence intervals.
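For illustration, the two centroid estimators reduce to the following computations; this is a minimal Python/NumPy sketch (our notation, not the authors' code), assuming patch is a small grey-level image region containing one IR-LED and thresh is a hypothetical binarization threshold.

```python
import numpy as np

def binary_centroid(patch, thresh):
    """Centroid of the binarized marker blob (thresh is an assumed threshold)."""
    ys, xs = np.nonzero(patch > thresh)
    return xs.mean(), ys.mean()

def grey_centroid(patch):
    """Center of mass weighted by grey-level intensities."""
    ys, xs = np.indices(patch.shape)
    w = patch.astype(float)
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()

# Confidence interval as in the static test: displacements holds the
# centroid shifts of the 10 markers over the 16 images at one position;
# the mean gives the measured increment, the standard deviation its spread.
# mean_shift, sigma = displacements.mean(), displacements.std()
```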
In the dynamic 2D test, shape distortions in the images caused by
fast 2D movements of the linear drive were investigated. We
compared a few images of the IR-LED array taken during movement to
statically obtained ones, which provided information about
photocenter displacements and an estimate of the dynamic error.
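The dynamic error estimate amounts to the photocenter displacement between the two acquisitions; a minimal sketch under the same assumptions (the 2.148mm pixel size given in Section 3):

```python
import numpy as np

def dynamic_error(static_xy, dynamic_xy, pixel_mm=2.148):
    """Photocenter displacement between static and moving acquisitions,
    returned in pixels and in millimeters at the object distance."""
    shift_px = np.linalg.norm(np.asarray(dynamic_xy) - np.asarray(static_xy))
    return shift_px, shift_px * pixel_mm
```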
We performed the 3D accuracy evaluation with two fully calibrated
cameras in a stereo setup. Using the linear drive again, the array
of IR-LEDs was moved along a line in 3D space with different
increments, and the smallest movement producing a linear response
in the reconstructed 3D space was sought.
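One way to test for such a linear response is to fit a line to the reconstructed 3D positions and inspect the residuals; the sketch below is our assumption of how such a check can be done, not the authors' stated procedure.

```python
import numpy as np

def line_fit_rms(points):
    """Fit a 3D line to an Nx3 point set by SVD and return the RMS
    point-to-line distance; small residuals indicate a linear response."""
    centered = points - points.mean(axis=0)
    direction = np.linalg.svd(centered, full_matrices=False)[2][0]
    proj = centered @ direction
    residuals = centered - np.outer(proj, direction)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())
```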
In the 3D dynamic test, we attached the IR-LED array to the wrist
of an industrial robot, dynamically guided it through predefined
points in space, and simultaneously recorded the trajectory with
the fully calibrated stereo cameras. We compared the 3D points
reconstructed from the images to the predefined points fed to the
robot controller.
3 TESTING SETUP
The test environment consisted of:
- PhotonFocus MV-D1024-80-CL-8 camera with a CMOS sensor and a
frame rate of 75 fps at full resolution (1024x1024 pixels),
- Active Silicon Phoenix-DIG48 PCI frame grabber,
- moving object (IR-LED array) at an approximate distance of 2m;
the IR-LED array (standard deviation of IR-LED accuracy below
0.007 pixel, as stated in (Papa and Torkar, 2006)) was fixed to a
Festo linear guide (DGE-25-550-SP) with a repetition accuracy of
±0.02mm.
For the static 2D test, the distance from the camera to the moving
object (in its middle position), which moved perpendicularly to the
optical axis, was 195cm; the camera field-of-view was 220cm, which
gives a pixel size of 2.148mm; the lens was a Schneider-Kreuznach
CINEGON 10mm/1,9F with an IR filter; the exposure time was 10.73ms
and the frame time 24.04ms, both obtained experimentally.
For the dynamic 2D test, the conditions were the same as in the
static test, except that the linear guide moved the IR-LED array at
a speed of 460mm/s and the exposure time was 1ms.
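As an order-of-magnitude check (our calculation, not given in the paper), the motion smear during exposure follows directly from these numbers:

```python
speed_mm_s = 460.0   # linear guide speed
exposure_s = 0.001   # 1 ms exposure time
pixel_mm = 2.148     # pixel size at the object distance (static 2D setup)

blur_mm = speed_mm_s * exposure_s   # 0.46 mm travelled during exposure
blur_px = blur_mm / pixel_mm        # roughly 0.21 pixel of motion smear
```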
In the 3D reconstruction test, the distances from the left and
right cameras to the IR-LED array were both about 205cm; the
baseline distance was 123cm; the lenses were Schneider-Kreuznach
CINEGON 10mm/1,9F with IR filters; the calibration
region-of-interest (ROI) was 342 x 333 pixels; the calibration
pattern consisted of 6 x 8 black/white squares; the calibration
method was that of (Zhang, 1998) and the reconstruction method that
of (Faugeras, 1992). The reconstruction was done off-line, and the
stereo correspondence problem was considered solved due to the
simple geometry of the IR-LED array; it is thus not addressed here.
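For reference, the reconstruction of one marker from a matched pair of centroids can be sketched as a standard linear (DLT) triangulation; this generic sketch assumes 3x4 projection matrices P1 and P2 from the calibration and is not necessarily the exact formulation of (Faugeras, 1992).

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear triangulation of one 3D point from two calibrated views.
    P1, P2: 3x4 projection matrices; x1, x2: (u, v) marker centroids."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous solution: right singular vector of the smallest singular value
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]
```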
For the 3D dynamic test, an ABB IRB 140 industrial robot was used,
with the standalone, fully calibrated stereo vision setup placed
about 2m from its base and calibrated in the same way as before.
The robot wrist moved through the corners of an imaginary triangle
with a side length of approximately 12cm. The images were taken
dynamically, at an approximate speed of 500mm/s, as the TCP passed
the corner points, and the points were reconstructed in 3D. The
relative lengths of the triangle sides were compared to the sides
of a statically obtained and reconstructed triangle. The robot's
native repeatability is 0.02mm and its accuracy is 0.01mm.
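The triangle comparison reduces to relative side lengths, which sidesteps the registration between the robot and camera frames; a minimal sketch, assuming dyn and stat are 3x3 arrays of reconstructed corner points:

```python
import numpy as np

def side_lengths(tri):
    """Side lengths of a triangle given as a 3x3 array of corner points."""
    return np.array([np.linalg.norm(tri[i] - tri[(i + 1) % 3])
                     for i in range(3)])

# Relative error of the dynamically reconstructed sides:
# rel_err = np.abs(side_lengths(dyn) - side_lengths(stat)) / side_lengths(stat)
```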
4 RESULTS
4.1 2D Accuracy Tests
The results of the evaluation tests are given below. The tests
include both the binary and the grey-level centroids. For each
movement increment, two figures are presented, as described below.
Pixel difference between the starting image and the consecutive
images (at consecutive positions) – for each position, the value is
calculated as the average displacement of all 10 markers, where
each marker's position is the average over the sequence of 16
images grabbed at that position under static conditions. The lines
in these figures should be as straight as possible.
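Each plotted value thus corresponds to the following reduction; the array layout below is a hypothetical one chosen for illustration.

```python
import numpy as np

def displacement_curve(centroids):
    """Average pixel displacement relative to the starting position.
    centroids: (n_positions, 16 images, 10 markers, 2) centroid coordinates.
    Returns one value per position; ideally a straight line in the increment."""
    per_pos = centroids.mean(axis=1)        # average over the 16 images
    shifts = per_pos - per_pos[0]           # displacement from starting image
    return np.linalg.norm(shifts, axis=-1).mean(axis=-1)  # mean over markers
```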
The 0.01mm, 0.1mm, and 1mm increments for
2D tests are presented in Figure 1.