R&D of the Japanese Input Method using Life Log on an Eye-controlled Communication Device for Users with Disabilities

Kazuaki Shoji, Hiromi Watanabe, Shinji Kotani

Abstract

We aim to enable smooth communication for persons who are physically unable to speak. In a previous study, we proposed three Japanese input methods using a portable eye-controlled communication device for users with conditions such as cerebral palsy or amyotrophic lateral sclerosis (ALS). However, these methods require nearly 30 seconds to input a single Japanese character. In this paper, we propose a method that estimates the intended word from the characters already entered and from the user's accumulated input history (life log). In addition, to improve prediction precision, we exploit connections between words based on a thesaurus. Simulation results show that accurate word conversion can be achieved from only a few input characters.
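The paper's actual algorithm is not reproduced on this page; as an illustrative sketch only (all names and data below are hypothetical), prefix-based word prediction of this kind can rank candidates by how often the user has typed them before (the life log) plus a bonus when a thesaurus links a candidate to the previously confirmed word:

```python
# Illustrative sketch only: not the paper's implementation.
# Candidates matching the typed prefix are ranked by (a) past input
# frequency from the user's "life log" and (b) a fixed bonus when the
# thesaurus relates the candidate to the previously confirmed word.
from collections import Counter

life_log = Counter({"hello": 5, "help": 2, "held": 1, "doctor": 4})
thesaurus = {"nurse": {"doctor", "hospital"}}  # hypothetical related-word sets

def predict(prefix, prev_word=None, k=3):
    """Return up to k candidate words matching prefix, best first."""
    related = thesaurus.get(prev_word, set())
    candidates = [w for w in life_log if w.startswith(prefix)]
    # Score = past frequency + thesaurus-relatedness bonus.
    return sorted(candidates,
                  key=lambda w: life_log[w] + (3 if w in related else 0),
                  reverse=True)[:k]

print(predict("hel"))           # ranked by past frequency alone
print(predict("doc", "nurse"))  # thesaurus link to "nurse" boosts "doctor"
```

With only a few entered characters, the frequency term favors the user's habitual words, while the thesaurus term captures contextual relatedness to the preceding word, mirroring the two cues described in the abstract.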

References

  1. Chen, H., Schatz, B. R., Yim, T. and Fye, D., 1995, Automatic thesaurus generation for an electronic community system, Journal of the American Society for Information Science (JASIS), 46(3), pp.175-193.
  2. Chen, Z., Liu, S., Wenyin, L., Pu, G. and Ma, W. Y., 2003, Building a Web Thesaurus from Web Link Structure, Proc. of the ACM SIGIR, pp.48-55.
  3. Coates-Stephens, S., 1993, The analysis and acquisition of proper names for the understanding of free text, Computers and the Humanities, Vol.26, pp.441-456.
  4. Hori, J. and Saitoh, Y., 2006, Development of a Communication Support Device Controlled by Eye Movements and Voluntary Eye Blink, IEICE TRANS. INF.&SYST., vol.E89-D, no.6, pp.1790-1797.
  5. Kageura, K., Tsuji, K. and Aizawa, A., 2000, Automatic thesaurus generation through multiple filtering, Proc. of the 18th International Conference on Computational Linguistics, pp.397-403.
  6. Kotani, S., Ohgi, K., Watanabe, H., Komasaki, T. and Yamamoto, Y., 2010, R&D of the Japanese Input Method using an eye-controlled communication device for users with disabilities and evaluation with NIRS, Proc. 2010 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Istanbul, pp.2545-2550.
  7. Kotani, S., Tanzawa, T., Watanabe, H., Ohgi, K., Komasaki, T. and Kenmotsu, T., 2011, Proficiency evaluation of three Japanese input methods using an eye-controlled communication device for users with disabilities, Proc. 2011 IEEE International Conference on Systems, Man, and Cybernetics (SMC2011), Alaska, pp.3230-3235.
  8. Lanitis, A., Taylor, C. J. and Cootes, T. F., 1996, An automatic face identification system using flexible appearance models, Image and Vision Computing, vol.13, no.5, pp.393-401.
  9. Lee, A., Kawahara, T. and Shikano, K., 2001, Julius -- an open source real-time large vocabulary recognition engine, Proc. EUROSPEECH, pp.1691-1694.
  10. Ministry of Health, Labour and Welfare, 2008, Statistics data (in Japanese).
  11. Moosmann, M., Ritter, P., Krastel, I., Brink, A., Thees, S., Blankenburg, F., Taskin, B., Obrig, H. and Villringer, A., 2003, Correlates of alpha rhythm in functional magnetic resonance imaging and near infrared spectroscopy, NeuroImage, 20, pp.145-158.
  12. Nakadai, K., Takahashi, T., Okuno, H. G., Nakajima, H., Hasegawa, Y. and Tsujino, H., 2010, Design and Implementation of Robot Audition System "HARK", Advanced Robotics, vol.24, no.5-6, pp.739-761.
  13. Noda, T., Shirai, H., Kuroiwa, J., Odaka, T. and Ogura, H., 2008, The Japanese Input Method Based on the Example for a Personal Digital Assistant, Memoirs of the Graduate School of Engineering, University of Fukui, 56, pp.69-76.
  14. Okamoto, M., Dan, H., Sakamoto, K., Takeo, K., Shimizu, K., Kohno, S., Oda, I., Isobe, S., Suzuki, T., Kohyama, K. and Dan, I., 2004, Three-dimensional probabilistic anatomical cranio-cerebral correlation via the international 10-20 system oriented for transcranial functional brain mapping, NeuroImage, 21, pp.99-111.
  15. Roche-Labarbe, N., Zaaimi, B., Berquin, P., Nehlig, A., Grebe, R. and Wallois, F., 2008, NIRS-measured oxy- and deoxyhemoglobin changes associated with EEG spike-and-wave discharges in children, Epilepsia, 49(11), pp.1871-1880.
  16. Villringer, A., Planck, J., Hock, C., Schleinkofer, L. and Dirnagl, U., 1993, Near infrared spectroscopy (NIRS): A new tool to study hemodynamic changes during activation of brain function in human adults, Neuroscience Letters, vol.154, Issues 1-2, pp.101-104.
  17. Yamamoto, M., Nagamatsu, T. and Watanabe, T., 2009, Development of an Eye-Tracking Pen Display based on the Stereo Bright Pupil Technique, Technical report of IEICE. HCS, 109(27), pp.147-150.
  18. Zhang, W., Zelinsky, G. and Samaras, D., 2007, Real-Time Accurate Object Detection Using Multiple Resolutions, Proc. IEEE International Conference on Computer Vision, pp.1-8.


Paper Citation


in Harvard Style

Shoji K., Watanabe H. and Kotani S. (2014). R&D of the Japanese Input Method using Life Log on an Eye-controlled Communication Device for Users with Disabilities. In Proceedings of the International Conference on Physiological Computing Systems - Volume 1: PhyCS, ISBN 978-989-758-006-2, pages 207-213. DOI: 10.5220/0004681202070213


in Bibtex Style

@conference{phycs14,
author={Kazuaki Shoji and Hiromi Watanabe and Shinji Kotani},
title={R\&D of the Japanese Input Method using Life Log on an Eye-controlled Communication Device for Users with Disabilities},
booktitle={Proceedings of the International Conference on Physiological Computing Systems - Volume 1: PhyCS},
year={2014},
pages={207-213},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004681202070213},
isbn={978-989-758-006-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Physiological Computing Systems - Volume 1: PhyCS
TI - R&D of the Japanese Input Method using Life Log on an Eye-controlled Communication Device for Users with Disabilities
SN - 978-989-758-006-2
AU - Shoji K.
AU - Watanabe H.
AU - Kotani S.
PY - 2014
SP - 207
EP - 213
DO - 10.5220/0004681202070213