applications and not for an evaluation of what is and/
or has been developed.
The objective of this paper is, therefore, twofold:
1) to define and create a set of procedures and tests
that can effectively be implemented to compare the
performance of applications developed with any
(native or non-native) development framework;
2) to help developers understand which multiplatform
framework is best suited to the objectives of each
application to be developed.
The remainder of this paper is organized as
follows: in the next section we gather related work
regarding performance assessment studies in mobile
applications. In section 3 we propose an approach for
preparing, designing, executing, and drawing
conclusions from performance assessments of mobile
applications.
Section 4 describes our case study and associated test
cases, and in section 5 we present the obtained
performance assessment results, using the proposed
approach. Finally, section 6 concludes the paper and
points out further research directions.
2 RELATED WORK
Using the search string "cross platform
frameworks" OR "hybrid mobile frameworks" OR
"native mobile frameworks" performance metrics on
the main scientific libraries (ACM, DBLP, IEEE and
Google Scholar), and filtering for results published
after 2018, we obtained 121 publications. Of these,
100 were rejected because they did not fit the theme.
From the remaining 21 we collected valuable
information that helped in the development of this
work, namely:
- The testing tools used to assess performance
(see, e.g., Asp Handledare et al. (2017));
- The performance metrics considered (see, e.g.,
Eskola (2018));
- The kind of features that were tested (see, e.g.,
Scharfstein and Gaurf (2013)).
To compose our approach, we considered the most
referenced items within these three types of collected
information. For the kind of mobile app features
tested (the 3rd type), we performed an additional
search to find the most downloaded apps in 2019 and
selected some of their most used features.
Our literature review also revealed that there is
no single “best” multiplatform development
framework; some are better suited to a given
situation, depending on the purpose and requirements
of the intended mobile application. Additionally,
evaluating the performance of a mobile application is
a complex process whose results can easily be
disputed.
For instance, one of the evaluation steps that we
found to be critical is ensuring that applications are
running in Release Mode when executing
performance tests, or, alternatively, in a mode
dedicated to the evaluation of applications (if
available) (Apple Inc. 2015; Lockwood 2013).
3 EVALUATION PROCESS
PROPOSAL
Our evaluation process began by identifying
existing related work on mobile app performance
assessment. Then, multiple mobile
application development frameworks were evaluated
to be used in the course of this performance
assessment. With the frameworks selected, we
analysed how their evaluation could be carried out,
studying the tools available for this purpose. We
then proceeded to a
more practical part: the identification and
implementation of mobile app software features that
could be an asset in the comparison and evaluation of
the development frameworks. The design and
development of the testing process was then one of
the most important steps, given that it was where
most of the related works we analysed showed
shortcomings. Finally, the process ended by
executing all the designed tests, statistically
evaluating the results obtained and discussing them.
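The statistical evaluation of repeated timing measurements can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: the sample values and framework names are invented, and a simple normal-approximation confidence interval is used.

```python
import math
import statistics

# Hypothetical response times (ms) for the same feature in apps built
# with two different frameworks; values are illustrative only.
framework_a = [120.4, 118.9, 121.7, 119.5, 120.1, 122.3, 119.8, 120.9]
framework_b = [131.2, 129.8, 132.5, 130.4, 131.9, 130.7, 132.1, 129.9]

def summarize(samples):
    """Return the mean, sample standard deviation and an approximate
    95% confidence half-width (normal approximation) for a sample."""
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    half_width = 1.96 * stdev / math.sqrt(len(samples))
    return mean, stdev, half_width

for name, samples in [("A", framework_a), ("B", framework_b)]:
    mean, stdev, hw = summarize(samples)
    print(f"Framework {name}: {mean:.1f} ms ± {hw:.1f} ms (s = {stdev:.2f})")
```

Reporting a confidence interval alongside the mean makes it easier to judge whether an observed difference between two frameworks is larger than the measurement noise.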
3.1 Release Mode
The term “Release Mode” was mentioned in an
official Apple lecture (Apple Inc. 2015) when
describing their performance analyser tool known as
Instruments. Given the lack of other works
considering this issue, and the inconsistency of the
tools and metrics used across them, we incorporated
the need for this execution mode both in the
proposed approach and in the tools used to assess
mobile application performance.
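As an illustration, the following standard build commands produce release builds suitable for performance testing on the major toolchains. This is a generic sketch, not part of the evaluated setup; the "MyApp" scheme name is a placeholder, and exact targets depend on the project layout.

```shell
# Android (Gradle): build the optimized, non-debuggable release variant
./gradlew assembleRelease

# Flutter: build a release APK (ahead-of-time compiled, no debug overhead)
flutter build apk --release

# iOS (Xcode): build with the Release configuration
# ("MyApp" is a placeholder scheme name)
xcodebuild -scheme MyApp -configuration Release build
```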
3.2 Frameworks Selection
Bearing in mind that the defined objective involves
the comparison of applications developed using
distinct frameworks, we focused on multiplatform
frameworks that currently hold the largest market
share among the developer community. Accordingly,
the selection of the frameworks was based on