Authors:
Tomoya Ishikawa, Kazumasa Yamazawa and Naokazu Yokoya
Affiliation:
Graduate School of Information Science, Nara Institute of Science and Technology, Japan
Keyword(s):
Telepresence, Novel view generation, Multi-cast, Network, Image-based rendering.
Related Ontology Subjects/Areas/Topics:
Enterprise Information Systems; Human-Computer Interaction; Informatics in Control, Automation and Robotics; Multimedia Systems; Robotics and Automation; Virtual Environment, Virtual and Augmented Reality; Virtual Reality and Augmented Reality
Abstract:
The advent of high-speed networks and high-performance PCs has prompted research on networked telepresence, which allows a user to see virtualized real scenes in remote places. View-dependent representation, which presents arbitrary view images to a user through an HMD or an immersive display, is especially effective in creating rich telepresence. The goal of our work is to realize a networked novel view telepresence system that enables multiple users to control their viewpoints and view directions independently by virtualizing real dynamic environments. In this paper, we describe a method for generating novel views from multiple omni-directional images captured at different positions, and we present a highly scalable prototype system that multiple users can operate simultaneously, together with experiments conducted with it. The novel view telepresence system constructs a virtualized environment from real live videos. The live videos are transferred to multiple users by a multicast protocol without increasing network traffic. The system synthesizes a view image for each user, with the viewpoint and view direction measured by a magnetic sensor attached to an HMD, and presents the generated view on the HMD. The system can generate each user's view image in real time because correspondences among the omni-directional images and the intrinsic and extrinsic camera parameters are estimated in advance.
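The view synthesis described above draws on images from several omni-directional cameras around the user's virtual viewpoint. As a minimal, hypothetical illustration only (the paper's actual method uses precomputed correspondences and calibrated camera parameters, not this weighting), inverse-distance blending weights for the nearest cameras could be computed as:

```python
import math

def blend_weights(user_pos, cam_positions):
    """Return normalized inverse-distance weights for blending the
    omni-directional camera images nearest a user's viewpoint.
    Hypothetical sketch: closer cameras contribute more to the
    synthesized novel view."""
    dists = [math.dist(user_pos, c) for c in cam_positions]
    inv = [1.0 / max(d, 1e-6) for d in dists]  # guard against d == 0
    total = sum(inv)
    return [w / total for w in inv]
```

For example, a user at the origin between cameras at distances 1 and 3 would weight the nearer camera three times as heavily, and the weights always sum to one.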