mouse is not possible, as the VR headsets cover the users' eyes. One solution for navigation is teleport navigation, which additionally helps to avoid simulation sickness. Tracked controllers simplify the investigation of virtual objects such as the insect. In 2D applications, a spectator usually needs to learn complicated combinations of mouse movements and keyboard buttons to select, pan, rotate, and scale objects in order to investigate them from all possible viewing directions. With tracked controllers, only the trigger is necessary to grab and select an object; afterwards, it can be turned and moved around by moving the hand as in real life. This kind of interaction may increase the usability and user-friendliness of visualization systems.
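The grab-and-move interaction described above can be sketched in a few lines: on trigger press, the object's pose is stored relative to the hand, and while the trigger is held, that offset is re-applied every frame, so the object follows the hand's motion and rotation. The sketch below is illustrative only; rotation is simplified to a yaw angle, and none of the names correspond to the API of any particular engine.

```python
import math

def grab_offset(hand_pos, hand_yaw, obj_pos, obj_yaw):
    """On trigger press: store the object's pose in hand coordinates."""
    dx, dy, dz = (obj_pos[i] - hand_pos[i] for i in range(3))
    # Rotate the offset into the hand's local frame (inverse yaw).
    c, s = math.cos(-hand_yaw), math.sin(-hand_yaw)
    local = (c * dx - s * dy, s * dx + c * dy, dz)
    return local, obj_yaw - hand_yaw

def follow_hand(hand_pos, hand_yaw, offset):
    """Each frame while the trigger is held: re-apply the stored offset."""
    local, yaw_off = offset
    c, s = math.cos(hand_yaw), math.sin(hand_yaw)
    world = (hand_pos[0] + c * local[0] - s * local[1],
             hand_pos[1] + s * local[0] + c * local[1],
             hand_pos[2] + local[2])
    return world, hand_yaw + yaw_off
```

With this scheme, turning the hand by 90 degrees swings the grabbed object around the hand by the same angle, which matches the real-life manipulation metaphor.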
On the other hand, complex or 'special' interactions need new types of user interfaces. In our use cases, we implemented a virtual button to convert the 3D beetle mesh into a 3D explosion drawing and a virtual slider to browse through the tomography image stack; the teleport indicator additionally acts as a virtual pointing device. Current VR games and applications use various additional ways to implement virtual control elements, such as virtual menus, buttons, and laser pointers. In general, virtual control elements are necessary to provide additional interaction abilities, and we expect common 'best practices' to emerge in the near future. We recommend mimicking real-world interaction schemes in the virtual reality environment to simplify the learning process for users of the VR system.
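The virtual slider for browsing the tomography image stack boils down to a simple mapping: the hand position along the slider axis is clamped to the slider's extent and converted linearly to a slice index. The function below is a minimal sketch of that idea; the names and the linear mapping are assumptions for illustration, not the implementation used in our application.

```python
def slider_to_slice(hand_x, slider_min, slider_max, n_slices):
    """Map a hand position along the slider axis to a slice index.

    The position is normalized to [0, 1] over the slider's extent,
    clamped at both ends, and scaled to the number of slices.
    """
    t = (hand_x - slider_min) / (slider_max - slider_min)
    t = min(max(t, 0.0), 1.0)                 # clamp to the slider ends
    return min(int(t * n_slices), n_slices - 1)
```

Clamping keeps the selection valid even when the tracked hand overshoots the ends of the virtual slider, which happens easily with free-space controllers.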
An open issue is actor/player collision in VR environments. The player can easily move his or her hands or head into or through virtual obstacles such as objects and walls. In traditional 2D applications, the rendering engine can simply prevent such movements; in VR applications, blocking the tracked motion could destroy the immersion or even result in simulation sickness.
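Since the tracked motion itself cannot be blocked, a VR application can at least detect when a hand or the head penetrates an obstacle and react softly, for example by dimming the view or ghosting the hand. A minimal sketch of such a detection, assuming obstacles approximated by axis-aligned bounding boxes (an assumption for illustration, not the method of any particular engine):

```python
def point_in_aabb(p, lo, hi):
    """True if point p lies inside the axis-aligned box [lo, hi]."""
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def penetration_depth(p, lo, hi):
    """Smallest distance from p to a face of the box; 0 if p is outside.

    The depth can drive a soft reaction, e.g. fading the view towards
    black as the head moves deeper into a wall.
    """
    if not point_in_aabb(p, lo, hi):
        return 0.0
    return min(min(p[i] - lo[i], hi[i] - p[i]) for i in range(3))
```

Reacting to the penetration depth rather than forbidding the movement preserves the one-to-one mapping between real and virtual motion, which is exactly what hard collision blocking would break.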
Some visualization systems are designed to support the collaborative work of a team, e.g. of domain scientists. Using a single VR headset isolates the active user from the rest of the team. In our VR application use cases, the images inside the VR headset are mirrored on a separate monitor, so team members are able to follow the user passively through the virtual environment.
It is planned to extend the beetle VR use case to a 'virtual museum' and to present virtual specimens of other insects, too. In order to provide numerous visitors with the VR experience, it would be desirable to have cheaper devices, perhaps based on smartphone headsets. As already mentioned, the Google VR device family offers too low an image quality and no controllers. However, Google's Daydream View VR headset recently became available, providing high-quality displays and one tracked hand controller, and we are planning to investigate its usability for our use cases.
VR headsets in conjunction with tracked controllers are powerful new tools for the visualization of higher-dimensional objects. In the near future, we expect further developments of VR software, higher-resolution HMDs, more accurate tracking, more natural 3D controllers, and multi-user VR systems. We are just at the beginning, but the current technology is already sufficient for serious and exciting scientific visualization.