Authors: Duo Chen, Jie Feng and Bingfeng Zhou
Affiliation: Institute of Computer Science and Technology, Peking University, Beijing, China
Keyword(s): Novel View Synthesis, Depth Map, Importance Sampling, Image Projection.
Related Ontology Subjects/Areas/Topics: Computational Photography; Computer Vision, Visualization and Computer Graphics; Geometry and Modeling; Image-Based Rendering; Rendering; Scene and Object Modeling
Abstract:
In this paper, we present a new method for synthesizing images of a 3D scene at novel viewpoints, based on a set of reference images taken in a casual manner. With such an image set as input, our method first reconstructs a sparse 3D point cloud of the scene, which is then projected onto each reference image to obtain a set of depth points. Afterwards, an improved error-diffusion sampling method generates a sampling point set in each reference image that includes the depth points and preserves image features well, so that the image can be triangulated on the basis of this point set. We then propose a distance metric based on Euclidean distance, color similarity and boundary distribution to propagate depth information from the depth points to the remaining sampling points, and hence a dense depth map can be generated by interpolation within the triangle mesh. Given a desired viewpoint, several of the closest reference viewpoints are selected, and their colored depth maps are projected to the novel view. Finally, the multiple projected images are merged to fill the holes caused by occlusion, resulting in a complete novel view. Experimental results demonstrate that our method achieves high-quality results for outdoor scenes that contain challenging objects.
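To make the depth-propagation step concrete, the following is a minimal sketch, in Python/NumPy, of a distance that combines the three cues named in the abstract (Euclidean distance, color similarity, boundary distribution). The function name, the weights, and the use of an edge-crossing count for the boundary term are our assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def propagation_distance(p, q, color_p, color_q, boundary_crossings,
                         w_s=1.0, w_c=1.0, w_b=1.0):
    """Hypothetical distance between two sampling points for depth propagation.

    Combines spatial (Euclidean) distance, color dissimilarity, and a
    boundary term (here assumed to be the number of image-edge crossings
    along the segment p-q). The weights w_s, w_c, w_b are placeholders.
    """
    spatial = np.linalg.norm(np.asarray(p, float) - np.asarray(q, float))
    color = np.linalg.norm(np.asarray(color_p, float) - np.asarray(color_q, float))
    return w_s * spatial + w_c * color + w_b * boundary_crossings
```

Under such a metric, an unsampled point would take its depth from the nearest depth point (for instance via a shortest-path search over the triangulation edges), before dense interpolation within each triangle.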
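Likewise, a minimal sketch of projecting a colored depth map from a reference view into the novel view, assuming a pinhole camera model with known intrinsics and camera-to-world poses; the function name and the nearest-point z-buffer splatting are our assumptions, not details taken from the paper.

```python
import numpy as np

def project_depth_map(depth, color, K_ref, pose_ref, K_novel, pose_novel):
    """Forward-warp a colored depth map from a reference view to a novel view.

    depth: (H, W) depths in the reference camera frame.
    color: (H, W, 3) reference image colors.
    K_*:   (3, 3) intrinsics; pose_*: (4, 4) camera-to-world transforms.
    Returns the warped color image and a validity mask for the novel view.
    """
    H, W = depth.shape
    # Back-project every reference pixel to a 3D point in camera coordinates.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # (3, H*W)
    cam_pts = np.linalg.inv(K_ref) @ pix * depth.reshape(1, -1)

    # Move the points into the novel camera frame via world coordinates.
    n = cam_pts.shape[1]
    world = pose_ref @ np.vstack([cam_pts, np.ones((1, n))])
    novel_cam = np.linalg.inv(pose_novel) @ world

    # Project into the novel image plane.
    proj = K_novel @ novel_cam[:3]
    z = proj[2]
    valid = z > 1e-6
    z_safe = np.where(valid, z, 1.0)  # avoid division by zero for culled points
    x = np.round(proj[0] / z_safe).astype(int)
    y = np.round(proj[1] / z_safe).astype(int)
    inside = valid & (x >= 0) & (x < W) & (y >= 0) & (y < H)

    # Z-buffer splat (simple per-point loop for clarity): keep the nearest
    # point per target pixel.
    out = np.zeros((H, W, 3), dtype=color.dtype)
    zbuf = np.full((H, W), np.inf)
    src_colors = color.reshape(-1, 3)
    for i in np.flatnonzero(inside):
        yi, xi = y[i], x[i]
        if z[i] < zbuf[yi, xi]:
            zbuf[yi, xi] = z[i]
            out[yi, xi] = src_colors[i]
    mask = np.isfinite(zbuf)  # False where no point landed, e.g. occlusion holes
    return out, mask
```

Warping several reference views this way and merging them by their masks matches the hole-filling step the abstract describes: pixels left uncovered by one view can be filled from another.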