learning situation. In Proceedings of the Fifth Inter-
national Conference of the International Society for
the Study of Argumentation (ISSA’02), Sic Sat, Ams-
terdam.
Baluja, S. and Pomerleau, D. (1994). Non-intrusive gaze-
tracking using artificial neural networks. In Neural
Information Processing Systems, Morgan Kaufmann
Publishers, New York.
Cho, Y. and Neumann, U. (1998). Multi-ring color fiducial
systems for scalable fiducial tracking augmented real-
ity. In Proceedings of the Virtual Reality Annual In-
ternational Symposium (VRAIS98), page 212, Wash-
ington, DC, USA.
Collewijn, H. (1999). Eye movement recording. In R. H. S.
Carpenter and J. G. Robson (Eds.), Vision Research:
A Practical Guide to Laboratory Methods, pages 245–
285, Oxford: Oxford Univ. Press.
Duchowski, A. T. (2002). A breadth-first survey of eye
tracking applications. In Behavior Research Methods,
Instruments, and Computers.
Fiala, M. (2005). ARTag, a fiducial marker system using
digital techniques. In Proceedings of the 2005 IEEE
Computer Society Conference on Computer Vision and
Pattern Recognition (CVPR05), volume 2, pages 590–
596, Washington, DC, USA.
Glenstrup, A. J. and Engell-Nielsen, T. (1995). Eye con-
trolled media: Present and future state. Master's thesis,
University of Copenhagen.
Henderson, J. M. and Hollingworth, A. (1998). Eye
movements during scene viewing: An overview. In
G. Underwood (Ed.), Eye Guidance in Reading and
Scene Perception.
Kato, H. and Billinghurst, M. (1999). Marker tracking and
HMD calibration for a video-based augmented reality
conferencing system. In Proceedings of the 2nd IEEE
and ACM International Workshop on Augmented Re-
ality (IWAR99), pages 85–92, Washington, DC, USA.
Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., and
Tachibana, K. (2000). Virtual object manipulation on
a table-top AR environment. In Proceedings of the
International Symposium on Augmented Reality (ISAR
2000), pages 111–119, Munich, Germany.
Kim, K. and Ramakrishna, R. (1999). Vision-based eye-
gaze tracking for human computer interface. In Inter-
national Conference on Systems, Man, and Cybernetics,
pages 324–329.
Matsumoto, Y. and Zelinsky, A. (2000). An algorithm for
real-time stereo vision implementation of head pose
and gaze direction measurement. In International
Conference on Automatic Face and Gesture Recogni-
tion, pages 499–504.
Naimark, L. and Foxlin, E. (2002). Circular data ma-
trix fiducial system and robust image processing for
a wearable vision-inertial self-tracker. In Proceedings
of the International Symposium on Mixed and Aug-
mented Reality (ISMAR02), pages 27–36, Washing-
ton, DC, USA.
Pastoor, S., Liu, J., and Renault, S. (1999). An experimen-
tal multimedia system allowing 3-d visualization and
eye-controlled interaction without user-worn devices.
In IEEE Trans. Multimedia, 1(1), pages 41–52.
Perona, P. and Malik, J. (1990). Scale-space and edge de-
tection using anisotropic diffusion. In IEEE Transac-
tions on Pattern Analysis and Machine Intelligence,
12(7), pages 629–639.
Pomplun, M., Velichkovsky, B., and Ritter, H. (1994).
An artificial neural network for high precision eye
movement tracking. In B. Nebel and L. Dreschler-
Fischer (Eds.), Lecture Notes in Artificial Intelli-
gence, Springer-Verlag, Berlin.
Reingold, E. M., Charness, N., Pomplun, M., and Stampe,
D. M. (2002). Visual span in expert chess players:
Evidence from eye movements. In Psychological Sci-
ence.
Rekimoto, J. (1998). Matrix: A realtime object identifica-
tion and registration method for augmented reality. In
Proceedings of the Third Asian Pacific Computer and
Human Interaction (APCHI98), pages 63–68, Wash-
ington, DC, USA.
Rekimoto, J. and Ayatsuka, Y. (2000). CyberCode: de-
signing augmented reality environments with visual
tags. In Proceedings of DARE 2000 on Designing aug-
mented reality environments (DARE00), pages 1–10.
Sibert, L. E. and Jacob, R. J. (2000). Evaluation of eye gaze
interaction. In Human Factors in Computing Systems:
CHI 2000 Conference Proceedings. ACM Press.
Stiefelhagen, R. and Yang, J. (1997). Gaze tracking for mul-
timodal human-computer interaction. In International
Conference on Acoustics, Speech, and Signal Process-
ing, pages 2617–2620.
Toyama, K. (1998). Look, ma, no hands! Hands-free cursor
control with real-time 3D face tracking. In Workshop
on Perceptual User Interfaces.
Wang, J. G., Sung, E., and Venkateswarlu, R. (2003). Eye
gaze estimation from a single image of one eye. In
Proceedings of the Ninth IEEE International Confer-
ence on Computer Vision (ICCV 2003).
Wu, H., Chen, Q., and Wada, T. (2005). Visual direction
estimation from a monocular image. In IEICE Trans.
Inf. and Syst., Vol. E88-D, No. 10, pages 2277–2285.
Zhang, Z. (1999). Flexible camera calibration by viewing
a plane from unknown orientations. In IEEE Interna-
tional Conference on Computer Vision.
Zhu, J. and Yang, J. (2002). Subpixel eye gaze tracking.
In IEEE International Conference on Automatic Face
and Gesture Recognition.
EYE AND GAZE TRACKING ALGORITHM FOR COLLABORATIVE LEARNING SYSTEM