3 DATA COLLECTION AND
ANALYSIS METHOD
To shed light on ways of better adopting EA, we studied how methods and tools are
used to explore EA's potential for organizational
performance from the tool vendor perspective.
To do so, we collected evidence from
publicly available content on official EA tool vendor
websites and on websites providing third-party reviews
of EA tools. We primarily followed the grey
literature review approach (Garousi, Felderer, &
Mäntylä, 2016) to collect and analyse the data. Grey
literature reviews have been acknowledged as a valid
alternative to academic literature reviews where the
state of the practice is concerned, as they can offer
substantial benefits (Garousi et al., 2016).
The evidence mainly comes from the website
contents of 16 leading EA tools. There are three
reasons for reviewing such website contents. First,
tools are both instrumental and highly important in the EA
discipline (Korhonen et al., 2016). Second, tools
generally make it easier for users to accept a
technology. For EA, user acceptance has been perceived
as one of the critical challenges; thus, we assume that
tool support could facilitate EA application. Third,
according to our preliminary observation, the content
offered on the tool vendors' websites is rich and
informative: many white papers, use cases, and
feature descriptions are provided there to share
knowledge with potential customers and to demonstrate
the vendors' expertise.
We collected data primarily from the websites of 16
EA tool vendors. The vendors were selected from the
list administered in Gartner's annual report
"Gartner Magic Quadrant for Enterprise Architecture
Tools" (Gartner, 2020; Forbes Media LLC., 2021),
which includes long-established
manufacturers as well as promising new challengers.
We believe that how these leading vendors apply EA
represents the current trend of first-line EA
applications. To complement the opinions and
information claimed by the vendors themselves, we
also referred to user reviews available in
(Gartner, 2021). These user reviews were verified as
explained in (I. Gartner, 2020) to ensure their quality
and reliability against criteria such as not
containing plagiarized content and highlighting
experiences related to the vendors/products. Some
reflections were also triangulated with an analysis of
user reviews on IT Central Station (IT Central Station,
2020), which, due to space limitations, is not
explicitly presented in this paper.
Our data analysis aimed to compare the state of
the practice as reported in a recent comprehensive study of
organizations applying EA (Kotusev, 2019) (further
referred to as "results-of-survey-study") with the
recommendations suggested by the tool vendors
(further referred to as "vendor recommendations").
We used (Kotusev, 2019) as representative of the
state of the practice because it proposes clear
statements about comprehensive EA application,
which made the comparison easier.
Notably, comparing evidence
extracted from the empirical study with that from the tool
vendor websites was not easy, as the concepts and structures often
differed. In fact, terminology misalignment in
scientific papers is a known issue (Korhonen et al.,
2016). To compare and map these differing aspects
of EA implementation, we focused on four
essential aspects of EA application: how to use, how
to create, how to organize, and how to regulate EA
artefacts. Our analysis started by reading through the
contents of the websites and gaining an initial
understanding of the overall breadth and depth of the
information and supporting evidence. Next, we
extracted evidence relevant to the four chosen aspects
of EA application. As similar evidence was presented
on multiple websites and for multiple products, we
selected the most representative formulations (clear and
complete statements). As a result, the evidence presented
in this paper mainly came from the websites of six
vendors: Avolution (Avolution, 2021a), Sparx (Sparx
Systems Pty Ltd., 2021), Ardoq (Ardoq AS., 2021),
ValueBlue (ValueBlue B.V., 2021), Mega (MEGA
International, 2021), LeanIX (LeanIX, 2021).
4 DATA ANALYSIS RESULTS
In this section, we present the four aspects of how to use,
create, organize, and regulate EA as critically
evaluated in a recent empirical study (Kotusev, 2019),
presented as "results-of-survey-study," versus as
suggested by the tool vendors and complemented with
user reviews, presented as "vendor recommendations."
"Reflections" are derived from the analysis of the
differences between the extracted evidence.
The four reflections related to EA are summarized
as follows:
• Roadmap (EA usage): empirically invalid versus
feasible and useful.
• EA (organizations): not a single description for all
stakeholders versus a single, comprehensive, and
valuable repository.
• EAFs/meta-models (EA regulations): purely
declarative versus fundamental.