Table 1: Ontology Design - Classes and Properties modeling.

Element | Definition
Classes | Abstract, Acknowledgments, Analysis, Appendices, ConclusionsFutureWork, Discussion, DiscussionSPL, Documentation, Evaluation, ExecutionSection, Experiment, ExperimentSPL, ExperimentPlanning, ExperimentPlanningSPL, Introduction, Package, References, RelatedWork, TypeContextExperiment, TypeContextSelection, TypeDesignExperiment, TypeExperiment, TypeExperimentSPL, TypeSelectionParticipantObjects
Object Properties | documentation, experiment, typeContextExperiment, typeContextSelection, typeDesignExperiment, typeExperiment, typeExperimentSPL, typeSelectionOfParticipants
Data Properties | idExperiment, title, authorship, publicationYear, publicationType, publicationVenue, pagesNumber, idExperimentSPL, nameSPLUsed, wasTheSPLSourceUsedInformed, idDocumentation, useTemplate, template, observationsAboutTemplateUsed, idAbstract, objective, abstractBackground, methods, results, limitations, conclusions, keywords, idIntroduction, problemStatement, researchObjective, context, idRelatedWork, technologyUnderInvestigation, alternativeTechnologies, relatedStudies, relevancePractice, idConclusionsFutureWork, summary, impact, futureWork, idExperimentPlanning, goals, experimentalUnits, experimentalMaterial, tasks, hypotheses, parameters, variables, experimentDesign, procedureProcedure, explicitQuasiExperimentInStudy, isAQuasiExperiment, idExperimentPlanningSPL, artifactSPLused, idExecutionSection, preparation, deviations, pilotProjectCarriedOut, howManyPilotProjectCarriedOut, idAnalysis, descriptiveStatistics, datasetPreparation, hypothesisTesting, whatQualitativeAnalysisPerformed, howDatahasBeenAnalyzed, experimentAnalysisBasedPValue, hasQualitativeAnalysisOfExperiment, studyHasPerformMetaAnalysis, idDiscussion, evaluationOfResultsAndImplications, inferences, lessonsLearned, threatsValidity, isFollowThreatsByWohlin, idDiscussionSPL, threatsValiditySPL, idAcknowledgements, acknowledgments, idReferences, references, idAppendices, appendices, idEvaluation, theAuthorsConcernedEvaluatingTheQuality, idPackage, isExperimentalPackageInformed, url, isLinkAvailable
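To make the design concrete, the following is a minimal instance sketch in Turtle; the namespace URI, the individual names (:exp1, :doc1), and the literal values are hypothetical and serve only to illustrate how the classes and properties above fit together.

@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix : <http://example.org/ontoexper-spl#> .   # hypothetical namespace

# A hypothetical experiment report linked to its documentation section.
:exp1 rdf:type :Experiment ;
      :title "A controlled experiment on SPL feature modeling" ;
      :publicationYear 2019 ;
      :documentation :doc1 .

# The documentation section records which report template was used.
:doc1 rdf:type :Documentation ;
      :useTemplate true ;
      :template "Jedlitschka and Pfahl" .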
3.4 Use Case Scenario
To illustrate a potential application of the proposed ontology, we present a simple use case scenario that extracts the most frequently used experiment report template.
The SPARQL query (Prud’hommeaux and Seaborne, 2008) in Listing 1 returns all experiment templates and how many times each has been used. From that result, we can extract the most used template.
Listing 1: Example of a SPARQL query.

SELECT ?template (COUNT(?template) AS ?count)
WHERE {
  ?doc rdf:type :Documentation .
  ?doc :template ?template .
}
GROUP BY ?template
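Sorting the grouped counts makes this extraction direct. The sketch below assumes the same graph and prefixes as Listing 1 and returns only the most frequently used template.

SELECT ?template (COUNT(?template) AS ?count)
WHERE {
  ?doc rdf:type :Documentation .
  ?doc :template ?template .
}
GROUP BY ?template
ORDER BY DESC(?count)
LIMIT 1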
This example traverses one class (Documentation) out of the 24 classes in the ontology and one data property (template) out of its 87 data properties. On this basis, the example uses about 0.0004% of the response capacity that the model allows, where this estimate counts the possible paths between classes and properties in the ontology.
This initial query example shows how inference mechanisms can be built on top of the ontology model proposed in this work. Thus, it is possible to extract information about SPL experiments using OntoExper-SPL.
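Other properties from Table 1 can be queried in the same way. The following sketch, again assuming the prefixes of Listing 1 and that title and publicationYear have Experiment as their domain, would list each experiment with its title and year of publication.

SELECT ?exp ?title ?year
WHERE {
  ?exp rdf:type :Experiment ;
       :title ?title ;
       :publicationYear ?year .
}
ORDER BY ?year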
3.5 Preliminary Evaluation
The OOPS! tool was used to assess the proposed ontology model. The tool helps to detect some of the most common pitfalls that appear when developing ontologies (Poveda-Villalón et al., 2014). OOPS! defines 41 evaluation points, 34 of which are run semi-automatically; the remaining ones depend on the specific ontology domain, and the authors encourage users to help improve the tool. The result given by the tool suggests how the elements of the ontology could be modified to improve it. However, not all identified pitfalls should be interpreted as failures; in some cases they are suggestions that should be reviewed manually. The tool classifies each detected pitfall as critical, important, or minor.
Table 2 summarizes the results of running our proposed ontology model through the OOPS! tool:
Table 2: OOPS! pitfalls.

Pitfall ID | Description | Level
P08 | Missing annotations (119 cases) | Minor
P10 | Missing disjointness | Important
P13 | Inverse relationships not explicitly declared (8 cases) | Minor
P19 | Multiple domains or ranges defined in properties (6 cases) | Critical
P41 | No license declared | Important
Based on the analysis provided by the OOPS! tool, we corrected the pitfalls found.
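These corrections map onto standard OWL and Dublin Core constructs. The following Turtle sketch (assuming the usual rdf:, owl:, and dcterms: prefixes; :isDocumentationOf and the ontology URI are hypothetical) illustrates the kind of axioms that address P10, P13, and P41.

# P10: declare sibling classes as pairwise disjoint.
:Abstract owl:disjointWith :Introduction .

# P13: declare inverse object properties explicitly
# (:isDocumentationOf is a hypothetical inverse name).
:documentation owl:inverseOf :isDocumentationOf .

# P41: attach a license annotation to the ontology header.
<http://example.org/ontoexper-spl> rdf:type owl:Ontology ;
    dcterms:license <https://creativecommons.org/licenses/by/4.0/> .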