Table 4: Recognition rates of the five states of the improved system (%).

State              Average   SD
Neutral            93.0      6.4
Strongly-staring   91.0      7.0
Softly-staring     81.0      9.4
Widely-opening     94.0      6.6
Softly-opening     74.0      6.6
On the basis of these results, we find that the eyelid-movement input method improves the user's sense of immersion significantly more than the mouse input method. We therefore conclude that it is important to integrate the input method into the immersive virtual environment itself, because the user can then operate it intuitively and need not be concerned about the seam between the real and virtual worlds. We also conclude that physiological data are useful for continuously and objectively evaluating the usability of an input method.
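As a concrete illustration of this kind of continuous evaluation, the following is a minimal sketch (our illustration, not the analysis pipeline used in this paper) that turns a skin-conductance (SC) recording into a windowed index relative to a resting baseline; the sampling rate, window lengths, and the function name sc_arousal_index are assumptions made for the example.

```python
# Minimal sketch: a continuous, objective usability signal from SC data.
# Assumption: SC is sampled at a fixed rate, and SC elevated above a
# resting baseline is read as higher arousal/stress during a task.
import numpy as np

def sc_arousal_index(sc, fs, baseline_s=30.0, window_s=5.0):
    """Per-window mean SC, normalized by a resting baseline.

    sc         : 1-D array of skin-conductance samples (microsiemens)
    fs         : sampling rate in Hz
    baseline_s : seconds of resting data at the start of the recording
    window_s   : analysis window length in seconds
    Returns one value per window; values > 1 mean SC above baseline.
    """
    baseline = sc[: int(baseline_s * fs)].mean()
    win = int(window_s * fs)
    n = len(sc) // win
    windows = sc[: n * win].reshape(n, win)
    return windows.mean(axis=1) / baseline

# Example on synthetic traces standing in for two input methods.
rng = np.random.default_rng(0)
fs = 32  # Hz; a plausible wearable-sensor rate (assumption)
mouse_sc = 2.0 + 0.4 * rng.standard_normal(fs * 120)
eyelid_sc = 1.8 + 0.2 * rng.standard_normal(fs * 120)
print("mouse :", sc_arousal_index(mouse_sc, fs).mean())
print("eyelid:", sc_arousal_index(eyelid_sc, fs).mean())
```

Under these assumptions, a consistently lower index for one input method would corroborate questionnaire results with an objective, continuously sampled measure.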
For future work, we plan to further improve the user's immersive experience by applying real-world embodiment movements and emotions estimated from the user's physiological data, such as EMG, skin conductance (SC), blood volume pulse (BVP), brain waves, and cerebral blood flow.