stereographic mode. We also used very high resolution light maps (2048x2048 texels) to improve video quality. All these graphic improvements may stress the hardware and rapidly kill the framerate. Despite this, the bar scene used in our test had about 15,000 triangles and high-resolution textures, and still ran at around 25 fps in stereographic mode, with one light source casting soft shadows and bloom lighting activated. It is important to mention that our engine is entirely dynamic, so the scene can be completely changed at run time; this, however, rules out several optimizations that could significantly speed up the rendering procedure (unlike most gaming engines, which often pre-generate an optimized but static environment). Our engine also features an almost direct use of models exported from 3D Studio MAX through a proprietary plug-in, without requiring any special care or conversion; other CAVE solutions based on videogame graphic engines, even if faster, usually impose more constraints in this respect.
5 CONCLUSION
In this paper we presented the different approaches we applied during the development of a high-quality stereographic CAVE from scratch. We showed how it is possible to build a very good system without professional equipment, thanks to the high quality of recent consumer-level products and a bit of practical sense. On the software side, we created our framework by readapting portions of code from an existing graphic engine, obtaining a robust, high-performing and complete solution in a relatively short time.
Our system has shown its versatility and quality in real applications and user tests, and also addresses the three major constraints that, according to Hibbard (Hibbard, 2000), limit the adoption of immersive VR devices: our framework offers a relatively cheap solution, fast sensor response and ease of use in real contexts.
In the next phase we will refine our system and use it in a wider range of both scientific and business-oriented projects, in order to extend the use of this framework to other areas requiring immersive content visualization.
REFERENCES
Sutcliffe, A., Gault, B., Fernando, T., and Tan, K. 2006.
Investigating interaction in CAVE virtual environ-
ments. ACM Trans. Comput.-Hum. Interact. 13, 2
(Jun. 2006), 235-267.
Tyndiuk, F., Thomas, G., Lespinet-Najib, V., and Schlick,
C. 2005. Cognitive comparison of 3D interaction in
front of large vs. small displays. In Proceedings of the
ACM Symposium on Virtual Reality Software and
Technology, Monterey, CA, USA (Nov. 2005).
Buxton, B. and Fitzmaurice, G. W. 1998. HMDs, Caves &
chameleon: a human-centric analysis of interaction in
virtual space. SIGGRAPH Comput. Graph. 32, 4 (Nov.
1998), 69-74.
Czernuszenko, M., Pape, D., Sandin, D., DeFanti, T.,
Dawe, G. L., and Brown, M. D. 1997. The Immer-
saDesk and Infinity Wall projection-based virtual real-
ity displays. SIGGRAPH Comput. Graph. 31, 2 (May.
1997), 46-49.
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R., and Hart, J. C. 1992. The CAVE: audio visual experience automatic virtual environment. Communications of the ACM (Jun. 1992), 64-72.
Sauter, P. M. 2003. VR²Go™: a new method for virtual reality development. SIGGRAPH Comput. Graph. 37, 1 (Feb. 2003), 19-24.
Gross, M., Würmlin, S., Naef, M., Lamboray, E., Spagno,
C., Kunz, A., Koller-Meier, E., Svoboda, T., Van
Gool, L., Lang, S., Strehlke, K., Moere, A. V., and
Staadt, O. 2003. Blue-c: a spatially immersive display
and 3D video portal for telepresence. ACM Trans.
Graph. 22, 3 (Jul. 2003), 819-827.
Jacobson, J. 2003. Using “CAVEUT” to Build Immersive
Displays With the Unreal Tournament Engine and a
PC Cluster, http://citeseer.ist.psu.edu/639123.html
Jacobson, J. and Hwang, Z. 2002. Unreal Tournament for immersive interactive theater. Communications of the ACM 45, 1 (2002), 39-42.
Peternier, A., Thalmann, D., and Vexo, F. 2006. Mental Vision: a computer graphics teaching platform. In Lecture Notes in Computer Science, Springer-Verlag, Berlin, 2006.
Koo, J. P., Ahn, S. C., Kim, H.-G., and Kim, I.-J. 2004. Active IR stereo vision based tracking system for immersive displays. In Proceedings of ACCV 2004 (Jan. 2004).
Ghazisaedy, M., Adamczyk, D., Sandin, D. J., Kenyon, R. V., and DeFanti, T. A. 1995. Ultrasonic calibration of a magnetic tracker in a virtual reality space. In Proceedings of the Virtual Reality Annual International Symposium (VRAIS'95), 179.
Hibbard, B. 2000. Visualization spaces. SIGGRAPH Com-
put. Graph. 34, 4 (Nov. 2000), 8-10.
GRAPP 2007 - International Conference on Computer Graphics Theory and Applications