study. This may serve as a tool for future standardization in the field of data quality evaluation.
The evidence provided by the selected studies also showed that, when performing a data quality evaluation of a dataset, the authors' main concern is not the application domain of the data but the analysis itself, according to the dimensions being used.
We also noticed that Human Observation is the approach used by most of the articles, and we believe this is because it is effective and easier to apply than semi-automatic or automatic data analysis. None of the selected articles reported the use of a machine learning algorithm when analysing the datasets. The data nonconformities cited were mostly related to data discoverability and usability. Dimensions related to the form in which data is presented to the user (Simplicity) and to how the user can interact with the dataset (Interaction) were cited by most of the articles and should be considered when making open government data available to the public.
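To give a concrete picture of the semi-automatic or automatic analysis that the selected studies did not report, the minimal Python sketch below computes a simple completeness ratio per column of a tabular dataset, one of the checks such an approach could automate. The file name, the CSV format, and the 95% threshold are hypothetical choices for illustration only, not part of any method reported by the selected studies.

import csv

def completeness_ratio(rows, column):
    # Fraction of rows with a non-empty value in the given column.
    filled = sum(1 for row in rows if (row.get(column) or "").strip())
    return filled / len(rows) if rows else 0.0

# "dataset.csv" is a hypothetical open government dataset in tabular form.
with open("dataset.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

for column in (rows[0].keys() if rows else []):
    ratio = completeness_ratio(rows, column)
    if ratio < 0.95:  # arbitrary threshold, for illustration only
        print(f"Column '{column}': completeness {ratio:.0%} below threshold")

A full evaluation pipeline would add analogous checks for other dimensions (e.g., conformity of value formats or presence of metadata); this is the kind of tooling that the selected studies replaced with Human Observation.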
There are some threats to validity in our study. First, our research questions may not encompass the full current state of the art of the quality evaluation of open government data available on the web. We used the GQM approach to better define the study objective and the research questions. It is possible that the search strings we used do not allow the identification of all studies in the area. We mitigated this threat by expanding the number of electronic repositories searched to three, all of them specific to the field of Computing. Even so, we cannot guarantee that all relevant primary studies available in electronic repositories have been identified, as some relevant studies may not have been covered by the search strings; we mitigated this threat by using alternative search terms and synonyms of the major terms in the search strings. Furthermore, each electronic repository has its own search process, and we do not know how these processes work or whether they behave identically. We mitigated this threat by adapting the search string to each electronic repository, under the assumption that equivalent logical expressions work consistently across all repositories used. Finally, the studies were selected according to the defined inclusion, exclusion, and quality criteria, but under our own judgment; thus, some studies may have been selected, or left out, incorrectly.
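To illustrate this adaptation, consider a generic boolean expression of the following form (the terms shown are illustrative examples, not the exact strings of our protocol):

("open government data" OR "open data") AND ("data quality" OR "quality evaluation" OR "quality assessment")

Each repository requires such an expression to be rewritten in its own query syntax and field operators, and the threat lies in assuming that the rewritten queries remain logically equivalent and are interpreted consistently by every search engine.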