SIMULTANEOUS ESTIMATION OF LIGHT SOURCE POSITIONS
AND CAMERA ROTATION
Masahiro Oida, Fumihiko Sakaue and Jun Sato
Nagoya Institute of Technology, Gokiso, Showa, Nagoya 466-8555, Japan
Keywords:
Mixed reality, Photometric calibration, Geometric calibration.
Abstract:
For mixed reality and other applications, it is very important to achieve photometric and geometric consistency in image synthesis. This paper describes a method for calibrating the camera and light sources simultaneously from photometric and geometric constraints. In general, feature points in a scene are used for computing camera positions and orientations. On the other hand, if the camera and objects are rigidly attached and move together, the changes in the shading of the objects in the images also carry useful information about the geometric camera motion. In this paper, we show that if we use both shading information and feature point information, we can calibrate cameras from fewer feature points than existing methods require. Furthermore, it is shown that the proposed method can calibrate light sources as well as cameras. The accuracy of the proposed method is evaluated by using real and synthetic images.
1 INTRODUCTION
Recently, mixed reality systems have been studied extensively (Milgram and Kishino, 1994). In these systems, real images are taken by cameras and virtual information is added to the images. In order to achieve realistic mixed reality, it is important to obtain the scene environment, such as lighting information, as well as geometric information, such as camera parameters.
In general, geometric information of cameras is obtained from the coordinates of feature points in images by using camera calibration techniques (Hartley and Zisserman, 2000; Faugeras and Luong, 2001). On the other hand, lighting information is obtained from images of a specular sphere (Powell et al., 2001), a Lambertian sphere (Takai et al., 2003), and so on (Sato et al., 1999b; Sato et al., 1999a). In these existing methods, geometric information and photometric information are obtained separately, in different ways. However, these two kinds of information are actually closely related to each other. For example, if an object is fixed to the camera, the illumination of the object changes according to the camera motion, and the change in intensity of the object in the camera image provides very useful information for estimating the camera motion. Also, if a light source is attached to a camera and moves together with the camera, the illumination of a static object changes according to the camera motion, and the change in intensity of the scene provides useful information for estimating the camera motion.
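As a rough illustration of this constraint, with notation introduced here for concreteness rather than taken from the paper, consider a Lambertian surface point of albedo \rho and normal n that is rigidly attached to the camera and lit by fixed distant sources l_k with intensities s_k. A minimal sketch of the observed intensity under a camera rotation R is:

% Illustrative only: the notation is assumed here, not the paper's own derivation.
% The point's normal rotates with the camera, so the intensity I depends on R.
I \;=\; \rho \sum_{k} s_k \, \max\!\bigl( 0,\; (R\,\mathbf{n})^{\top} \mathbf{l}_k \bigr)

Since everything except R is fixed, the observed change in I over time constrains the camera rotation; this is the kind of photometric cue that is combined with feature point measurements.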
In this paper, we propose a method which enables us to calibrate lighting information and geometric information simultaneously by combining photometric and geometric constraints. In this method, lighting information is obtained from observation of a reference object, and camera parameters are computed by combining photometric and geometric information, such as shading information and the image coordinates of feature points. Since general real scenes contain many light sources and their distributions vary, we consider a geodesic dome around the 3D scene and assume that the light sources are distributed on this geodesic sphere. The distribution of light sources is then estimated and used for recovering camera motions from photometric and geometric information. By using both photometric and geometric information, camera calibration can be achieved from fewer feature points.
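This excerpt does not give the authors' implementation, but the light source estimation step can be sketched as follows, assuming a coarse geodesic dome (here simply the vertices of an icosahedron), a reference Lambertian object with known normals and albedo, and a non-negative least-squares solver; all function names below are hypothetical.

import numpy as np
from scipy.optimize import nnls

def icosahedron_directions():
    # Unit vertices of an icosahedron, used as a coarse geodesic dome of
    # candidate light-source directions (the paper's dome resolution is
    # not specified in this excerpt).
    phi = (1.0 + np.sqrt(5.0)) / 2.0
    verts = []
    for a in (-1.0, 1.0):
        for b in (-phi, phi):
            verts += [(0.0, a, b), (a, b, 0.0), (b, 0.0, a)]
    V = np.array(verts)
    return V / np.linalg.norm(V, axis=1, keepdims=True)

def estimate_light_intensities(normals, albedo, observed, dome_dirs):
    # Each pixel p of the reference Lambertian object gives one row of
    #   I_p = albedo_p * sum_k s_k * max(0, n_p . l_k),
    # a linear system in the unknown non-negative intensities s_k.
    A = albedo[:, None] * np.clip(normals @ dome_dirs.T, 0.0, None)
    s, _ = nnls(A, observed)
    return s

For example, with normals (P x 3), albedo (P,) and observed intensities (P,) measured on the reference object, estimate_light_intensities(normals, albedo, observed, icosahedron_directions()) returns one estimated intensity per dome direction, and directions with near-zero intensity can be discarded.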
2 ESTIMATION OF LIGHT
SOURCE POSITIONS
We first consider the estimation of light source positions from images. In this paper, we consider a scene where a known object with a Lambertian surface exists together with other unknown objects, and they are illuminated