Table 2: Numbers of reference gestures whose distance from the etalon exceeds the example thresholds, and of random gestures with exactly one point of return whose distance lies below the thresholds.
Threshold D    Ref > threshold D    Rand ≤ threshold D
2              175                  1
4              36                   22
6              8                    101
8              2                    190
10             1                    258
12             0                    326
4 RESULTS AND CONCLUSIONS
The algorithm was implemented as a prototype using the DirectInput API to read coordinates in the background. The resulting application was tested on an ADI V-Touch 1710 resistive touch display, an NEC V-Touch 1921 CU capacitive touch display and a SmartBoard 540 resistive touch wall. The computational load was not measurable on any of the computer setups. The gesture detection error rate and its dependence on the distance threshold corresponded to expectations. The touch wall produced high noise in the recorded data, which caused frequent points of return and rejection of gestures with segment lengths close to one or two pixels. To remove this noise, a segment was recorded only after it reached a defined minimal length; a minimal length of four pixels yielded results equivalent to those obtained on the displays.
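As an illustration, the minimal-length noise filter can be sketched in C++ as follows. This is only an outline under the assumption of a simple 2D point type; coordinate acquisition (e.g. via DirectInput) is omitted, and the names SegmentFilter and minLength are ours, not taken from the prototype.

    #include <cmath>
    #include <vector>

    struct Point { double x, y; };

    class SegmentFilter {
    public:
        explicit SegmentFilter(double minLength) : minLength_(minLength) {}

        // Feed a raw touch sample; a new segment endpoint is kept only once
        // the movement from the last kept point reaches the minimal length,
        // which suppresses the one- or two-pixel jitter of the touch wall.
        void addSample(const Point& p) {
            if (recorded_.empty()) { recorded_.push_back(p); return; }
            const Point& last = recorded_.back();
            double dist = std::hypot(p.x - last.x, p.y - last.y);
            if (dist >= minLength_)
                recorded_.push_back(p);   // segment long enough: record it
            // otherwise the sample is discarded as noise
        }

        const std::vector<Point>& recordedPoints() const { return recorded_; }

    private:
        double minLength_;            // e.g. 4 pixels on the touch wall
        std::vector<Point> recorded_;
    };

With minLength set to zero the filter degenerates to recording every sample, which reproduces the noisy behaviour observed on the touch wall.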
To assess the efficiency of the proposed right mouse button surrogate and to compare it with the tap&hold, software button and hardware button methods, an experiment was designed in which users react to a series of graphical symbols by pressing either the left or the right virtual mouse button, depending on the currently displayed symbol. The expected type of reaction, the reaction time and the number of corrections are recorded in the course of the task.
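For illustration only, the record kept for a single trial might take the following shape; the type and field names (TrialRecord, expected, performed, and so on) are assumptions made for this sketch and are not taken from the experimental software.

    #include <chrono>
    #include <vector>

    // Which virtual mouse button the displayed symbol calls for,
    // and which one the user actually produced.
    enum class ButtonType { Left, Right };

    // Hypothetical per-trial record: expected and performed reaction,
    // measured reaction time, and the number of corrective presses.
    struct TrialRecord {
        ButtonType expected;
        ButtonType performed;
        std::chrono::milliseconds reactionTime;  // symbol shown to button press
        int corrections;
    };

    // The task log is then simply one record per displayed symbol:
    // std::vector<TrialRecord> taskLog;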
For technical, organizational and economic reasons the experiment is yet to be performed on a statistically significant sample of users. A thorough analysis of the method's impact on user performance thus remains future work. However, preliminary tests taken by a limited number of users suggest the potential of the method, especially in comparison with the button methods. The tests also show that the method requires some dexterity and practice, but that the touch interaction scheme is intuitive.
The proposed right mouse button surrogate for touch screens is a sound alternative to the other methods currently in use, but it is not intended as a complete replacement for them. The detection algorithm of the proposed touch interaction scheme is computationally efficient and easy to implement, and it can therefore be integrated into both the software driver and the firmware of the underlying touch screen hardware.
ACKNOWLEDGEMENTS
This work was supported by the Ministry of Defence
research project MO0 FVZ0000604.