application and answered a questionnaire giving detailed feedback on it. The
application was then improved following their suggestions.
Their main conclusions were that being able to
select the avatar's dominant signing hand is useful,
since deaf people sign in different ways, and that the
signing speed of the avatar should be configurable,
since users have different literacy levels.
Regarding the sign language itself, the vocabulary
must be extended to cover the specific subject areas
and updated with neologisms.
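The two configurable options the testers asked for (dominant signing hand and signing speed) could be modeled as a small settings object. This is only an illustrative sketch; the class and parameter names are hypothetical and not part of the system described in the paper.

```python
from dataclasses import dataclass

# Hypothetical avatar settings motivated by the testers' feedback:
# a selectable dominant (signing) hand and a configurable signing speed.
@dataclass
class AvatarSettings:
    dominant_hand: str = "right"   # "right" or "left"
    speed_factor: float = 1.0      # 1.0 = normal signing speed

    def validate(self) -> "AvatarSettings":
        if self.dominant_hand not in ("right", "left"):
            raise ValueError("dominant_hand must be 'right' or 'left'")
        if not 0.5 <= self.speed_factor <= 2.0:
            raise ValueError("speed_factor must be between 0.5 and 2.0")
        return self

# Example: a left-handed signer with a lower literacy level
# might prefer a slower signing speed.
settings = AvatarSettings(dominant_hand="left", speed_factor=0.75).validate()
```

Keeping these preferences per user, rather than hard-coding them in the avatar, is what makes the tool adaptable to the heterogeneous signing styles the testers reported.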
As for the application, some signs were not clear
enough, the vocabulary was limited, some
movements were rough and needed to be smoothed,
and the avatar showed little facial expression.
Finally, the tool presents some advantages and
improvements over existing ones. From the users'
point of view, it is a very useful system for on-line
courses or as a visual book. In the short term, this
application will be useful in secondary school
classrooms, for studying at home, and for reviewing
Spanish material, because digital written materials
are difficult for deaf students (some of them cannot
read). In their opinion, it is a remarkable and
original system in the area of new technologies, and
it will bring about a major transformation in the
training of deaf people, making training and
academic courses more accessible to them.
8 CONCLUSIONS AND FUTURE WORK
In this paper, an automatic Spanish-to-LSE
translator for academic purposes, working from
voice and PowerPoint input, was presented,
reviewing the full application functionality and the
results of the test carried out to obtain direct
feedback from deaf people.
After the test, the system was extended to
incorporate some of the ideas and to solve the
problems reported by the deaf group, but some
improvements remain as future work.
It is necessary to improve the avatar's facial
expressions, adding gestures that complement the
different signs and make them more understandable.
It is also important to review the clarity of the whole
vocabulary through a thorough test with deaf people.
An improved chat between teacher and student is
under development, incorporating a visual interface
with sign pictograms and animations on the student
side to make it easier for deaf people to use.
After these improvements, a new test will be
necessary to check whether the system is useful
enough to be incorporated into specific courses.
ACKNOWLEDGEMENTS
The authors thank the Junta de Andalucía for its
collaboration in the project, Elena Gándara for her
expert help in the rule extraction and sign capture,
and the Deaf Association of Seville for its
participation in the test.
GRAPP 2013 - International Conference on Computer Graphics Theory and Applications