4 CONCLUSION
Calibration of the system parameters is essential for
optimal performance of the localization algorithms.
We have presented a calibration procedure for determining the
static transformation between the camera and PTU coordinate
frames. The procedure requires only observations of the same
set of points from three different configurations of the PTU.
The procedure is simple to deploy in a real setting, without
any special preparation of the environment solely for
calibration purposes.
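The core of such a calibration step, recovering a rigid transformation from corresponding point sets, can be illustrated with the standard SVD-based least-squares solution. This is a hedged sketch, not the paper's exact procedure; the function name and its inputs are hypothetical:

```python
import numpy as np

def rigid_transform(P, Q):
    """Estimate rotation R and translation t such that Q ~= R @ P + t.

    P, Q: 3xN arrays of corresponding points expressed in two frames
    (e.g., points observed in the camera frame vs. the PTU frame).
    Standard SVD-based least-squares fit; illustrative only.
    """
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    H = (P - p_mean) @ (Q - q_mean).T              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Given three or more non-collinear correspondences, this recovers the static transform exactly in the noise-free case and in the least-squares sense otherwise.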
System uncertainties were taken into account in the
development of the localization algorithm. By introducing
the flat-ground-surface constraint, the considered
localization problem can be solved in a two-dimensional
plane, where the camera measures distances and angles to the
visible landmarks. Assuming normally distributed noise in the
image measurements of landmark points, the standard deviation
increases predominantly in the direction of the image ray,
proportionally to the squared distance from the camera.
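This distance-dependent noise model can be sketched as follows. The constants `k_d` and `sigma_phi` are illustrative assumptions, not values from the paper; the sketch builds a 2D landmark position covariance from range noise growing with the squared distance and an approximately constant bearing noise:

```python
import numpy as np

def landmark_cov(d, phi, k_d=0.01, sigma_phi=0.01):
    """2D position covariance of a landmark at distance d, bearing phi.

    Range noise grows with the squared distance (sigma_d = k_d * d**2),
    modelling stereo depth uncertainty along the image ray; bearing
    noise is roughly constant, giving a cross-ray std of d * sigma_phi.
    The diagonal (along-ray, cross-ray) covariance is rotated by the
    bearing angle into the robot frame.
    """
    sigma_d = k_d * d**2
    S = np.diag([sigma_d**2, (d * sigma_phi)**2])
    c, s = np.cos(phi), np.sin(phi)
    J = np.array([[c, -s], [s, c]])
    return J @ S @ J.T
```

The along-ray variance thus grows with the fourth power of distance, so distant landmarks contribute much weaker range information than nearby ones.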
We have presented an approach that uses multiple partial
Kalman filters, where each filter estimates only the distance
and heading to a particular landmark, and the outputs of
these estimators are then combined to calculate the robot
pose. The results demonstrate that this approach is feasible
and converges to the true pose of the mobile robot. Its main
benefit is computational efficiency, since the covariance
matrices are low-dimensional. To reduce memory consumption
during large-area localization, landmarks that have not been
observed for a long time can be forgotten. The presented
models also remain valid when the camera is moving. The
proposed system can be augmented with a controller that
tracks the nearest visible landmarks, reducing the time
during which no landmarks are in the stereo camera's field
of view.
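The per-landmark filtering idea can be sketched as a tiny decoupled Kalman filter. This is an illustrative toy rather than the paper's implementation: the class name, the identity measurement model, and all noise constants are assumptions. It shows why the covariance stays 2x2 per landmark regardless of map size:

```python
import numpy as np

class LandmarkFilter:
    """Partial Kalman filter over the state [distance, heading] to one landmark.

    Each landmark gets its own 2-state filter, so every covariance
    matrix is 2x2 no matter how many landmarks are being tracked.
    """

    def __init__(self, z0, P0=np.eye(2)):
        self.x = np.asarray(z0, dtype=float)   # state: [d, phi]
        self.P = np.array(P0, dtype=float)     # 2x2 state covariance

    def predict(self, Q=np.diag([0.05, 0.01])):
        # In this simplified sketch the effect of robot motion enters
        # only as additive process noise Q.
        self.P = self.P + Q

    def update(self, z, R=np.diag([0.1, 0.02])):
        # The camera measures distance and heading directly, so H = I.
        S = self.P + R                          # innovation covariance
        K = self.P @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.x)
        self.P = (np.eye(2) - K) @ self.P
```

A full pose estimate would then be computed from several such per-landmark estimates; that combination step is omitted here.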
ACKNOWLEDGEMENTS
The authors acknowledge the financial support from
the Slovenian Research Agency (research core fund-
ing No. P2-0219).
Vision-based Localization of a Wheeled Mobile Robot with a Stereo Camera on a Pan-tilt Unit