REFERENCES
Alismail, H. S., Baker, L. D., and Browning, B. (2012). Automatic calibration of a range sensor and camera system. In 2012 Second Joint 3DIM/3DPVT Conference: 3D Imaging, Modeling, Processing, Visualization & Transmission (3DIMPVT 2012), Pittsburgh, PA. IEEE Computer Society.
Fischler, M. and Bolles, R. (1981). Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381–395.
Frohlich, R., Kato, Z., Trémeau, A., Tamas, L., Shabo, S., and Waksman, Y. (2016). Region based fusion of 3d and 2d visual data for cultural heritage objects. In 23rd International Conference on Pattern Recognition, ICPR 2016, Cancún, Mexico, December 4-8, 2016, pages 2404–2409.
Geiger, A., Moosmann, F., Car, O., and Schuster, B. (2012). Automatic camera and range sensor calibration using a single shot. In IEEE International Conference on Robotics and Automation, ICRA 2012, 14-18 May, 2012, St. Paul, Minnesota, USA, pages 3936–3943.
Gong, X., Lin, Y., and Liu, J. (2013). 3d lidar-camera extrinsic calibration using an arbitrary trihedron. Sensors, 13(2).
Hartley, R. I. and Zisserman, A. (2003). Multiple View Geometry in Computer Vision. Cambridge University Press.
Lebeda, K., Matas, J., and Chum, O. (2012). Fixing the locally optimized RANSAC. In British Machine Vision Conference, BMVC 2012, Surrey, UK, September 3-7, 2012, pages 1–11.
Malis, E. and Vargas, M. (2007). Deeper understanding of the homography decomposition for vision-based control. Research report.
Pandey, G., McBride, J., Savarese, S., and Eustice, R. (2010). Extrinsic calibration of a 3d laser scanner and an omnidirectional camera. In 7th IFAC Symposium on Intelligent Autonomous Vehicles, volume 7, Lecce, Italy.
Pandey, G., McBride, J. R., Savarese, S., and Eustice, R. M. (2012). Automatic targetless extrinsic calibration of a 3d lidar and camera by maximizing mutual information. In Proceedings of the AAAI National Conference on Artificial Intelligence, pages 2053–2059, Toronto, Canada.
Park, Y., Yun, S., Won, C. S., Cho, K., Um, K., and Sim, S. (2014). Calibration between color camera and 3d lidar instruments with a polygonal planar board. Sensors, 14(3):5333–5353.
Pusztai, Z., Eichhardt, I., and Hajder, L. (2018). Accurate calibration of multi-lidar-multi-camera systems. Sensors, 18(7):2139.
Rodriguez F., S., Frémont, V., and Bonnifait, P. (2008). Extrinsic calibration between a multi-layer lidar and a camera. In IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems.
Tóth, T., Pusztai, Z., and Hajder, L. (2020). Automatic lidar-camera calibration of extrinsic parameters using a spherical target. In 2020 IEEE International Conference on Robotics and Automation (ICRA), pages 8580–8586.
Veľas, M., Španěl, M., Materna, Z., and Herout, A. (2014). Calibration of rgb camera with velodyne lidar. In WSCG 2014 Communication Papers Proceedings, volume 2014, pages 135–144. Union Agency.
Zhang, Q. and Pless, R. (2004). Extrinsic calibration of a camera and laser range finder (improves camera calibration). In 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, September 28 - October 2, 2004, pages 2301–2306.
Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330–1334.
Zhou, L., Li, Z., and Kaess, M. (2018). Automatic extrinsic calibration of a camera and a 3d lidar using line and plane correspondences. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018, Madrid, Spain, October 1-5, 2018, pages 5562–5569.
APPENDIX
Solution of $\arg\min_{g} \| Fg - h \|_2$ subject to $\| g \|_2 = 1$
The objective is to show how the equation $Fg = h$ can be solved optimally, in the least-squares sense, subject to $g^T g = 1$. The cost function $J$ can be written with a Lagrange multiplier $\lambda$ as
$$J = (Fg - h)^T (Fg - h) + \lambda\, g^T g.$$
The optimal solution is obtained by setting the derivative of $J$ with respect to the vector $g$ to zero:
$$\frac{\partial J}{\partial g} = 2 F^T (Fg - h) + 2 \lambda g = 0.$$
Therefore the optimal solution is $g = \left( F^T F + \lambda I \right)^{-1} F^T h$. For the sake of simplicity, let us denote the vector $F^T h$ by $r$ and the symmetric matrix $F^T F$ by $L$. Then $g = (L + \lambda I)^{-1} r$. Finally, the constraint $g^T g = 1$ has to be considered as
$$r^T (L + \lambda I)^{-T} (L + \lambda I)^{-1} r = 1. \qquad (9)$$
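The root of Eq. 9 can also be found numerically: after diagonalizing the symmetric matrix $L$, the left-hand side becomes a scalar function of $\lambda$ that is strictly decreasing for $\lambda > -\lambda_{\min}(L)$, so its unique root in that interval (which yields the minimizer in the non-degenerate case) can be bracketed and located by one-dimensional root finding. The Python sketch below is only an illustration of this construction, not part of the proposed method; the helper name constrained_lsq, the random test system, and the use of scipy.optimize.brentq are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import brentq


def constrained_lsq(F, h):
    """Minimize ||F g - h||^2 subject to ||g||^2 = 1.

    Uses the stationarity condition g = (L + lam*I)^{-1} r with L = F^T F
    and r = F^T h, where lam solves the secular form of Eq. 9,
    r^T (L + lam*I)^{-2} r = 1, written in the eigenbasis of L.
    """
    L = F.T @ F
    r = F.T @ h
    d, V = np.linalg.eigh(L)        # eigen-decomposition of the symmetric L
    c = V.T @ r                     # coordinates of r in the eigenbasis

    def phi(lam):
        # ||(L + lam*I)^{-1} r||^2 expressed through the eigenvalues of L
        return np.sum((c / (d + lam)) ** 2)

    # phi is strictly decreasing for lam > -d.min(); bracket the unique root
    # of phi(lam) = 1 there (assumes c has a nonzero smallest-eigenvalue part).
    lo = -d.min() + 1e-9
    hi = -d.min() + np.linalg.norm(r) + 1.0   # large enough that phi(hi) < 1
    lam = brentq(lambda x: phi(x) - 1.0, lo, hi)

    g = V @ (c / (d + lam))         # g = (L + lam*I)^{-1} r
    return g, lam


# Small random test: overdetermined system with a three-dimensional unknown.
rng = np.random.default_rng(0)
F = rng.standard_normal((20, 3))
h = rng.standard_normal(20)
g, lam = constrained_lsq(F, h)
print(np.linalg.norm(g), lam)       # the norm of g should be 1
```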
The inverse matrix can be written as
$$(L + \lambda I)^{-1} = \frac{\operatorname{adj}(L + \lambda I)}{\det(L + \lambda I)},$$
where $\operatorname{adj}(L + \lambda I)$ and $\det(L + \lambda I)$ denote the adjoint matrix³ and the determinant of the matrix $L + \lambda I$, respectively. This can be substituted into Eq. 9 as follows:
$$r^T \operatorname{adj}^T(L + \lambda I)\, \operatorname{adj}(L + \lambda I)\, r = \det^2(L + \lambda I).$$
³The adjoint (adjugate) matrix is the transpose of the matrix of cofactors; since $L + \lambda I$ is symmetric, the two coincide.
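Clearing the denominators in this way turns Eq. 9 into a polynomial equation in $\lambda$, of degree six when $L$ is a $3 \times 3$ matrix, so the admissible multipliers can also be obtained with a standard polynomial solver. The SymPy sketch below only illustrates this construction under the same notation; the function name secular_polynomial and the random test data are hypothetical.

```python
import numpy as np
import sympy as sp


def secular_polynomial(L, r):
    """Eq. 9 with denominators cleared, as a polynomial in lambda:
    det(L + lam*I)^2 - r^T adj(L + lam*I)^T adj(L + lam*I) r = 0.
    For a 3x3 matrix L this polynomial has degree six."""
    lam = sp.symbols('lam')
    M = sp.Matrix(L) + lam * sp.eye(L.shape[0])
    rv = sp.Matrix(r)
    adj = M.adjugate()   # equals the matrix of cofactors here, M being symmetric
    expr = sp.expand(M.det() ** 2 - (rv.T * adj.T * adj * rv)[0, 0])
    return sp.Poly(expr, lam)


# Random 3x3 example with L = F^T F and r = F^T h as in the appendix.
rng = np.random.default_rng(1)
F = rng.standard_normal((10, 3))
h = rng.standard_normal(10)
L, r = F.T @ F, F.T @ h

poly = secular_polynomial(L, r)
roots = [complex(z) for z in poly.nroots()]
# The minimizer corresponds to the real root keeping L + lam*I positive
# semidefinite, i.e. the largest real root in the non-degenerate case.
eig_min = np.linalg.eigvalsh(L).min()
lam_star = max(z.real for z in roots
               if abs(z.imag) < 1e-8 and z.real >= -eig_min - 1e-8)
g = np.linalg.solve(L + lam_star * np.eye(3), r)
print(np.linalg.norm(g))             # should be close to 1
```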