Expert vs Novice Evaluators - Comparison of Heuristic Evaluation Assessment

Magdalena Borys, Maciej Laskowski

Abstract

In this paper, the authors compare the results of a website heuristic evaluation performed by a small group of experts with those obtained from a large group of novice evaluators. Heuristic evaluation is normally performed by a few experts, as applying heuristics effectively requires their knowledge and experience. However, research involving usability experts is usually very costly. The authors therefore propose an experiment that contrasts the results of an evaluation performed by novice evaluators who are familiar with the assessed website against the results obtained from expert evaluators, in order to verify whether the two are comparable. The usability of the website was evaluated using the authors' heuristics with an extended list of control questions.

References

  1. Borys, M., Laskowski, M., Milosz, M., 2013. Memorability experiment vs. expert method in websites usability evaluation. In ICEIS 2013 - Proceedings of the 15th International Conference on Enterprise Information Systems, 3, SciTePress.
  2. Botella, F., Alarcon, E., Penalver, A., 2013. A new proposal for improving heuristic evaluation reports performed by novice evaluators. In ChileCHI '13 - Proceedings of the 2013 Chilean Conference on Human-Computer Interaction. ACM, NY, USA, pp. 72-75.
  3. Connell, I. W., 2000. Full Principles Set. Set of 30 usability evaluation principles compiled by the author from the HCI literature. In http://www0.cs.ucl.ac.uk/staff/i.connell/DocsPDF/PrinciplesSet.pdf.
  4. Dillon, A., Song, M., 1997. An empirical comparison of the usability for novice and expert searchers of a textual and a graphic interface to an art-resource database. Digital Information.
  5. Frøkjær, E., Hornbæk, K., 2008. Metaphors of human thinking for usability inspection and design. In ACM Transactions on Computer-Human Interaction, 14(4), pp. 1-33.
  6. Gerhardt-Powals, J., 1996. Cognitive engineering principles for enhancing human-computer performance. In International Journal of Human-Computer Interaction, 8(2), pp. 189-211.
  7. Hertzum, M., Jacobsen, N. E. 2001. The evaluator effect: A chilling fact about usability evaluation methods. In International Journal of Human-Computer Interaction, 13(4), pp. 421-443.
  8. Hollingsed, T., Novick, D. G., 2007. Usability Inspection Methods after 15 Years of Research and Practice. In SIGDOC '07 - Proceedings of the 25th annual ACM international conference on Design of communication, NY, pp. 249-255.
  9. Prümper, J. et al., 1991. Errors in computerized office work: Difference between novice and expert users. In ACM SIGCHI Bulletin.
  10. Koyani, S., Bailey, R. W., Nall, J. R., 2004. Research-Based Web Design & Usability Guidelines. In Computer Psychology.
  11. Landauer, T. K., 1996. The Trouble with Computers: Usefulness, Usability, and Productivity. MIT Press.
  12. Lanzilotti, R., Ardito, C., Costabile, M. F., De Angeli, A., 2011. Do patterns help novice evaluators? A comparative study. In International Journal of Human-Computer Studies, 69(1-2), pp. 52-69.
  13. Ling, Ch., Salvendy, G., 2009. Effect of evaluators' cognitive style on heuristic evaluation: Field dependent and field independent evaluators. In International Journal of Human-Computer Studies, 67(4), pp. 382-393.
  14. Nielsen, J., and Molich, R., 1990. Heuristic evaluation of user interfaces. In Proceedings ACM CHI'90 Conference, pp. 249-256.
  15. Weinschenk, S., Barker, D. T., 2000. Designing Effective Speech Interfaces, Wiley, 1 edition.


Paper Citation


in Harvard Style

Borys M. and Laskowski M. (2014). Expert vs Novice Evaluators - Comparison of Heuristic Evaluation Assessment. In Proceedings of the 16th International Conference on Enterprise Information Systems - Volume 3: ICEIS, ISBN 978-989-758-029-1, pages 144-149. DOI: 10.5220/0004970901440149


in Bibtex Style

@conference{iceis14,
author={Magdalena Borys and Maciej Laskowski},
title={Expert vs Novice Evaluators - Comparison of Heuristic Evaluation Assessment},
booktitle={Proceedings of the 16th International Conference on Enterprise Information Systems - Volume 3: ICEIS},
year={2014},
pages={144-149},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004970901440149},
isbn={978-989-758-029-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 16th International Conference on Enterprise Information Systems - Volume 3: ICEIS
TI - Expert vs Novice Evaluators - Comparison of Heuristic Evaluation Assessment
SN - 978-989-758-029-1
AU - Borys M.
AU - Laskowski M.
PY - 2014
SP - 144
EP - 149
DO - 10.5220/0004970901440149