APPENDIX
Instructions to Participants. In addition to the following instructions, which were read aloud at the start of the study, participants were given documentation showing that the study had been reviewed and approved by the Institutional Review Board (IRB) at the institution where it was conducted.
The purpose of this study is to gain a better understanding of how humans look at images of galactic events.
You will be shown a sequence of images and a video. Your task will be to evaluate the quality of each image and video by assigning a rating from 1 to 10, with 1 being the lowest quality and 10 the highest. Please state your rating when the blank screen between images is displayed. The images you will view include several from the Hubble Space Telescope as well as computer-generated images. You will also see a computer-generated video.
During the course of the experiment, a noninvasive camera will be used to record your eye movements. Please try to minimize your head movements, as head movements may adversely affect the quality of the results. A short calibration process, performed at the start of the experiment, is necessary to ensure that your eyes are being tracked accurately. Calibration simply involves looking at targets on the screen until they disappear. The entire experiment should take no longer than 10 minutes to complete.
The results of this study may be published in scientific research journals or presented at professional conferences. However, your name and identity will not be revealed, and your record will remain anonymous. Your name will not be used in any data collection, so it will be impossible to distinguish your answers from other people's answers.
The potential benefits of this study to society include improvements in data visualization techniques and the advancement of scientific knowledge of human visual perception. Participation is entirely voluntary, and you may choose to withdraw from this study at any time. If you decide not to participate, or to withdraw from this study, there will be no penalty to you. Do you have any questions before we begin?