Authors: Richard Müller; Pascal Kovacs; Jan Schilbach; Ulrich W. Eisenecker; Dirk Zeckzer and Gerik Scheuermann
Affiliation: University of Leipzig, Germany
Keyword(s): Software Visualization, Evaluation, Controlled Experiment, 3D.
Related Ontology Subjects/Areas/Topics: Abstract Data Visualization; Computer Vision, Visualization and Computer Graphics; General Data Visualization; Interpretation and Evaluation Methods; Software Visualization; Usability Studies and Visualization; Visualization Taxonomies and Models
Abstract:
In the field of software visualization, controlled experiments are an important instrument for investigating the specific reasons why some software visualizations exceed expectations in providing insight and easing task solving while others fail to do so. Nevertheless, controlled experiments in software visualization are rare. One reason is that performing such evaluations in general, and performing them in a way that minimizes the threats to validity in particular, is hard to accomplish. In this paper, we present a structured approach to conducting a series of controlled experiments in order to provide empirical evidence for the advantages and disadvantages of software visualizations in general, and of 2D vs. 3D software visualizations in particular.