RIGHT MOUSE BUTTON SURROGATE ON TOUCH SCREENS

Jan Vanek, Bruno Jezek

Abstract

The pervasiveness of computer systems is largely determined by their ease of use. Touch screens have proven to be a natural interface with strong sensorimotor feedback. Although multi-touch technologies are increasingly popular, single-touch screens are still often preferred: they are cheaper, and they map directly to pointing devices such as computer mice, so no software modifications are required. They can therefore easily be integrated into existing systems with WIMP interfaces, e.g. MS Windows based systems. Applications in such systems often rely on the user pressing the right mouse button, for example to open context menus. Since single-touch screens register only one touch at a time, various methods are used to let the user determine the outcome of a touch. This paper proposes a new interaction scheme for this purpose and an algorithm to detect it.
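To make the problem concrete, the sketch below shows one widely used existing surrogate method on single-touch screens: a touch held nearly stationary beyond a timeout is mapped to a right-click, a shorter tap to a left-click. This illustrates the kind of method the paper contrasts with; it is not the scheme the paper itself proposes, and all names and threshold values here are illustrative assumptions.

```python
# Minimal sketch of a long-press right-button surrogate for a
# single-touch screen. Thresholds are illustrative, not taken from
# the paper.

LONG_PRESS_THRESHOLD_S = 0.8   # touch held at least this long -> right-click
MAX_DRIFT_PX = 10              # allowed finger drift during the press

def classify_touch(press_time_s, release_time_s, drift_px):
    """Map a single touch to a mouse-button event.

    A nearly stationary touch exceeding the timeout is reported as a
    right-click; a shorter stationary tap as a left-click; a moving
    touch is treated as a drag and gets no button surrogate.
    """
    if drift_px > MAX_DRIFT_PX:
        return "drag"
    duration = release_time_s - press_time_s
    if duration >= LONG_PRESS_THRESHOLD_S:
        return "right-click"
    return "left-click"
```

A known drawback of this approach is that every right-click incurs the full timeout delay, which is one motivation for alternative interaction schemes such as the one proposed in the paper.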



Paper Citation


in Harvard Style

Vanek, J. and Jezek, B. (2011). RIGHT MOUSE BUTTON SURROGATE ON TOUCH SCREENS. In Proceedings of the 13th International Conference on Enterprise Information Systems - Volume 4: ICEIS, ISBN 978-989-8425-56-0, pages 304-309. DOI: 10.5220/0003504803040309


in Bibtex Style

@conference{iceis11,
author={Jan Vanek and Bruno Jezek},
title={RIGHT MOUSE BUTTON SURROGATE ON TOUCH SCREENS},
booktitle={Proceedings of the 13th International Conference on Enterprise Information Systems - Volume 4: ICEIS},
year={2011},
pages={304-309},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003504803040309},
isbn={978-989-8425-56-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 13th International Conference on Enterprise Information Systems - Volume 4: ICEIS
TI - RIGHT MOUSE BUTTON SURROGATE ON TOUCH SCREENS
SN - 978-989-8425-56-0
AU - Vanek J.
AU - Jezek B.
PY - 2011
SP - 304
EP - 309
DO - 10.5220/0003504803040309