optimize their values alongside the pose parameters. This added degree of freedom would allow the temporal distribution of the CP to be adapted dynamically, avoiding the generation of new ones.
• We will try to integrate our model into a real-time SLAM system.
7 CONCLUSION
In this paper, we have enriched the B-spline trajectory model from (Steven Lovegrove, 2013) to integrate a NUD of the CP in the RS camera SLAM context. We have shown that this NUD models trajectories more accurately than a UD with the same number of CP. We have proposed two methods to generate the CP, using either IMU measurements or iterative reprojection error minimization. Both methods have been tested on synthetic trajectories and proven efficient for solving the PnP problem. This enriched model will be integrated into a SLAM system in future work.
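The non-uniform time sampling at the core of this model can be illustrated with a standard De Boor evaluation of a cubic B-spline over a non-uniform knot vector. The sketch below interpolates 3D positions only (the full model of (Steven Lovegrove, 2013) uses cumulative basis functions on SE(3), which this omits), and the knot vector and control points are illustrative values, not taken from our experiments:

```python
import numpy as np

def de_boor(t, knots, ctrl, p=3):
    """Evaluate a degree-p B-spline with a (possibly non-uniform)
    knot vector `knots` and control points `ctrl` at time t."""
    # Locate the knot span containing t, clamped to the valid range.
    k = int(np.searchsorted(knots, t, side='right')) - 1
    k = min(max(k, p), len(knots) - p - 2)
    # De Boor recursion on the p+1 control points that influence t.
    d = [np.asarray(ctrl[j + k - p], dtype=float) for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            i = j + k - p
            denom = knots[i + p - r + 1] - knots[i]
            alpha = (t - knots[i]) / denom if denom > 0.0 else 0.0
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

# Clamped cubic spline with a non-uniform interior knot at t = 0.2:
# placing the knot early concentrates flexibility where the motion
# is assumed fast, which is the intent of a NUD of the CP.
knots = [0, 0, 0, 0, 0.2, 1, 1, 1, 1]
ctrl = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [2, 1, 0], [3, 1, 1]]
samples = [de_boor(t, knots, ctrl) for t in np.linspace(0.0, 1.0, 5)]
```

Moving the interior knot changes where the spline can bend without adding control points, which is the effect exploited by the two CP-generation methods above.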
REFERENCES
Davison, A. J. (2003). Real-time simultaneous locali-
sation and mapping with a single camera. In 9th
IEEE International Conference on Computer Vision
(ICCV 2003), 14-17 October 2003, Nice, France,
pages 1403–1410.
Engel, J., Schöps, T., and Cremers, D. (2014). LSD-SLAM:
Large-scale direct monocular SLAM. In ECCV.
Furgale, P., Barfoot, T. D., and Sibley, G. (2012).
Continuous-time batch estimation using temporal ba-
sis functions. In Robotics and Automation (ICRA),
2012 IEEE International Conference on, pages 2088–
2095.
Gonzalez, A. (2013). Localisation par vision multi-spectrale.
Application aux systèmes embarqués. PhD thesis, INSA de Toulouse.
Hartley, R. I. and Zisserman, A. (2004). Multiple View Ge-
ometry in Computer Vision. Cambridge University
Press, ISBN: 0521540518, second edition.
Hedborg, J., Forssén, P. E., Felsberg, M., and Ringaby, E.
(2012). Rolling shutter bundle adjustment. In Com-
puter Vision and Pattern Recognition (CVPR), 2012
IEEE Conference on, pages 1434–1441.
Hedborg, J., Ringaby, E., Forssén, P. E., and Felsberg, M.
(2011). Structure and motion estimation from rolling
shutter video. In Computer Vision Workshops (ICCV
Workshops), 2011 IEEE International Conference on,
pages 17–23.
Klein, G. and Murray, D. (2007). Parallel tracking and
mapping for small AR workspaces. In Proc. Sixth
IEEE and ACM International Symposium on Mixed
and Augmented Reality (ISMAR’07), Nara, Japan.
Klein, G. and Murray, D. (2009). Parallel tracking and mapping
on a camera phone. In Proc. Eighth IEEE and ACM International
Symposium on Mixed and Augmented Reality (ISMAR'09), Orlando.
Li, M., Kim, B., and Mourikis, A. I. (2013). Real-time mo-
tion estimation on a cellphone using inertial sensing
and a rolling-shutter camera. In Proceedings of the
IEEE International Conference on Robotics and Au-
tomation, pages 4697–4704, Karlsruhe, Germany.
Mouragnon, E., Lhuillier, M., Dhome, M., Dekeyser, F., and Sayd, P.
(2006). Real-time localization and 3D reconstruction. In 2006 IEEE
Computer Society Conference on Computer Vision and Pattern Recognition
(CVPR'06), volume 1, pages 363–370.
Mur-Artal, R., Montiel, J. M. M., and Tardós, J. D. (2015).
ORB-SLAM: A versatile and accurate monocular SLAM system.
IEEE Transactions on Robotics, 31(5):1147–1163.
Patron-Perez, A., Lovegrove, S., and Sibley, G. (2015). A
spline-based trajectory representation for sensor fu-
sion and rolling shutter cameras. Int. J. Comput. Vi-
sion, 113(3):208–219.
Roussillon, C., Gonzalez, A., Solà, J., Codol, J., Mansard,
N., Lacroix, S., and Devy, M. (2012). RT-SLAM: A
generic and real-time visual SLAM implementation.
CoRR, abs/1201.5450.
Lovegrove, S., Patron-Perez, A., and Sibley, G. (2013).
Spline fusion: A continuous-time representation for
visual-inertial fusion with application to rolling shutter
cameras. In Proceedings of the British Machine Vision
Conference. BMVA Press.
Strasdat, H., Montiel, J., and Davison, A. J. (2010). Real-time
monocular SLAM: Why filter? In Robotics and Automation (ICRA),
2010 IEEE International Conference on, pages 2657–2664. IEEE.
Pose Interpolation for Rolling Shutter Cameras using Non Uniformly Time-Sampled B-splines