Enhancing User Experience in e-Government: A Deep Dive into
e-Government Forms and Citizen Perceptions
Asma Aldrees¹,² (https://orcid.org/0000-0003-0448-9889) and Denis Gračanin¹ (https://orcid.org/0000-0001-6831-2818)
¹ Department of Computer Science, Virginia Tech, Blacksburg, VA 24060, U.S.A.
² College of Computer Science, King Khalid University, Abha 61421, Saudi Arabia
Keywords:
User Experience, e-Government, e-Forms, User Interface Design, Quantitative Research, Empirical Study,
User Experience Questionnaire, Citizen Perception.
Abstract:
Understanding citizens’ perceptions is essential for improving e-government services and strengthening the
relationship between citizens and the government. Therefore, this study focuses on the design principles
of e-government forms and their impact on citizens’ experiences. Specifically, it examines the context of
e-government in the United States and seeks to understand citizens’ perceptions. A web prototype for e-
government forms was developed based on the US Web Design System (USWDS) guidelines. A web-based
survey using a user experience questionnaire was conducted, with five scales: efficiency, trust, trustworthiness
of content, quality of content, and clarity. Then, we recruited 200 US citizens to evaluate the implemented
e-form. The results indicated positive user experiences across all scales. However, the trust scale received
the lowest score, despite being considered the most important by citizens. Participants recognized the im-
portance of trust but felt it was not fully established. More research is needed to investigate the trust value
of e-government design principles in the US. By following established design principles and addressing trust
concerns, governments can create user-friendly interfaces that foster trust and meet citizens’ expectations.
1 INTRODUCTION
In the realm of e-government, governments strive to
develop high-quality services that prioritize the qual-
ity of their content and delivery. Service content
refers to the effectiveness and functionality of the ser-
vices provided to fulfill citizens’ needs. Meanwhile,
service delivery encompasses the interaction between
e-government providers and citizens, considering the
mediums or channels employed to facilitate this inter-
action (Al-Besher and Kumar, 2022). The exchange
of information and services in e-government presents
challenges that necessitate a standardized information
exchange process (Umbach and Tkalec, 2022). Elec-
tronic forms play a vital role in enabling this process
within e-government (Scholta et al., 2019). Citizens
benefit from electronic forms as they offer accessi-
bility at any time and from anywhere, eliminating
the need for physical collection efforts. Furthermore,
users can identify and rectify errors prior to submis-
sion, saving time and enhancing access rates to government services.
However, the effective management of e-
government forms faces numerous challenges from
various perspectives, including design considera-
tions. Designing excellent and efficient e-government
forms proves to be a formidable task (Scholta et al.,
2020). Efficient form design is crucial in reducing
efforts and minimizing misunderstandings between
governments and citizens.
To ensure the creation of valuable and precise e-
government forms, it is imperative to prioritize the
user experience. User experience encompasses the
emotional, cognitive, and physical reactions that users
have while engaging with a service, and it serves as a crucial evaluation criterion for such services. ISO 9241-210 provides an
explanation of user experience as a comprehensive
concept (ISO 9241-210:2019, 2019). Several ar-
ticles have discussed the technical aspects of user
experience in the e-government domain and eluci-
dated the construction of e-government services (Sukmasetya et al., 2018; Aldrees and Gračanin, 2021a; Aldrees and Gračanin, 2021b). In this paper, we con-
duct an analysis of the user experience related to e-
government forms. We utilize the User Experience
Questionnaire (UEQ) as a baseline measurement tool
for a comprehensive and direct assessment of user
experience (Hinderks et al., 2018; Hinderks et al.,
2019). This approach enables us to gauge the accep-
tance of the e-government form design by citizens.
The rest of the article is structured as follows: sec-
tion 2 presents the research background, while sec-
tion 3 describes our research design. We introduce the
results of this user study in section 4. We then discuss
the findings and conclude the paper in section 5.
2 BACKGROUND
2.1 e-Government Forms
Essential for government service delivery, forms are
the official communication channels and the informa-
tion exchange interfaces between government agen-
cies and citizens. They serve two main purposes: to start the process of receiving a specific government service or to exchange the necessary information with the government. In e-government, elec-
tronic forms refer to specific graphical user interfaces
used to transfer data requests and exchange informa-
tion (Scholta et al., 2019). They are defined as a
“structured interface that provides predefined labeled
spaces for manual data input and is used repeatedly
to transfer data to one or more natural or legal per-
sons” (Scholta et al., 2020).
Recent research has focused on the usability of
forms in web and mobile applications and has pro-
vided specific design guidelines. Yu et al. (2021) assessed electronic forms for patients as an alternative to paper forms, while Shcherbyna et al. (2021) discussed the issues of concluding business contracts in electronic form. Yet, crafting forms within the
realm of e-government presents a considerable chal-
lenge. Governments need to consider all required reg-
ulations to specify what data they need to capture
from citizens. They also must focus on creating e-
forms that effectively showcase government services.
Veeramootoo et al. (2018) concluded that the design
of e-forms has a significant impact on citizens’ per-
ceptions of the quality of the offered e-government
services. As previously stated, forms serve as the
main communication bridges connecting citizens and
governments. Therefore, it is crucial to assess the user
experience of e-forms in order to guarantee the qual-
ity of e-government services.
2.2 User Experience Questionnaire
In the e-government domain, many e-government initiatives have failed to achieve their intended role due to a limited range of offered services, poor website functionality, or a lack of user trust. A closer look at these failures indicates that initiatives have drifted by focusing on the technology rather than on understanding the needs of the users who are expected to adopt e-government services (Madariaga et al., 2019). Research has therefore turned to improving the citizen-government relationship by evaluating citizens' experience with e-government services in order to increase their satisfaction. The term user experience (UX) is defined as
“person’s perceptions and responses resulting from
the use and/or anticipated use of a product, system, or
service”, including users’ emotions, beliefs, percep-
tions, preferences, and behaviors that occur before,
during and after using the offered service (ISO 9241-
210:2019, 2019). The concept of UX in e-government
has profound effects on citizens’ satisfaction, percep-
tion, and expectations of government services (Ka-
maruddin and MdNoor, 2017). Attending to UX aims to strengthen and improve users' beliefs and emotions toward e-government services.
A comprehensive UX evaluation must take all UX aspects of the offered services into account. The User Experience Questionnaire (UEQ) is an effective quantitative tool for collecting valuable feedback from end users, allowing them to evaluate the feelings that arise while interacting with a service and to assess their overall UX (Laugwitz et al., 2008). The UEQ is a widely used question-
naire that represents and measures the most impor-
tant UX aspects with six scales (Attractiveness, Ef-
ficiency, Perspicuity, Dependability, Stimulation, and
Novelty). It aims to allow a quick and precise assess-
ment by end-users to cover a comprehensive impres-
sion of UX. It allows users to express their feelings,
attitudes, and impressions that appear while using and
experiencing the given service or product. Each scale
in the UEQ has four evaluation items that are seman-
tic differentials with a 7-point answer scale (Hinderks
et al., 2018). They consist of a pair of terms with op-
posite meanings that span a semantic dimension. An
example of an item representing the scale efficiency
is:
slow   ○   ○   ○   ○   ○   ○   ○   fast
Hence, users rate each item on the 7-point Likert
scale. Then, the answers are scaled from -3 (fully
agree with the negative term) to +3 (fully agree with
the positive term). Half of the items start with the
positive term, and the others with the negative term
(in randomized order).
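For illustration, the minimal Python sketch below shows this scoring step. It assumes raw answers are coded 1 to 7 from left to right and that items presented with the positive term on the left are reverse-coded so that +3 always corresponds to the positive term; the function and variable names are ours, not part of the UEQ tooling.

```python
# Minimal sketch of UEQ item scoring (assumption: raw answers coded 1..7, left to right).
def score_item(raw_answer: int, positive_term_on_left: bool) -> int:
    """Map a 1..7 answer to the -3..+3 UEQ scale.

    If the negative term is on the left (e.g., "slow ... fast"), answer 1
    means full agreement with the negative term and maps to -3.
    If the positive term is on the left, the coding is reversed.
    """
    if not 1 <= raw_answer <= 7:
        raise ValueError("UEQ answers must be between 1 and 7")
    scored = raw_answer - 4            # 1 -> -3, 4 -> 0, 7 -> +3
    return -scored if positive_term_on_left else scored

# Example: a participant marks 6 on the "slow ... fast" item (negative term on the left).
print(score_item(6, positive_term_on_left=False))  # +2, leaning toward "fast"
```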
However, some highly relevant UX factors are not covered by the original UEQ. The UEQ+ is therefore a modular extension of the UEQ scales: it offers a larger list of twenty UX scales for comprehensively evaluating the UX in a given scenario. The UEQ+ is not itself a UX questionnaire; it is a tool for building questionnaires adapted to specific scenarios (Schrepp and Thomaschewski, 2020). Hence, the researcher
can pick the UX scales that are the most relevant for
the specific scenario. In addition, each scale in the
UEQ+ contains a rating concerning the importance of
the scale, as shown below:
I consider the product property described by these
terms as
Completely irrelevant   ○   ○   ○   ○   ○   ○   ○   Very important
The importance ratings are used to calculate a key
performance indicator (KPI). Hence, a single number
represents the overall UX of the offered product or
service (Schrepp and Thomaschewski, 2019).
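As a rough illustration of how such a KPI can be derived, the hedged Python sketch below computes an importance-weighted average of the scale means. The exact weighting scheme of Schrepp and Thomaschewski (2019) may differ in detail; here we simply assume that the importance ratings (on the -3 to +3 scale) are shifted into a non-negative range and normalized to sum to one, and all numbers are hypothetical.

```python
# Sketch of a UX KPI as an importance-weighted mean of scale means.
# Assumption: importance ratings (-3..+3) are shifted by +3 and normalized;
# the scale names and values below are hypothetical examples.
def ux_kpi(scale_means: dict, importance: dict) -> float:
    weights = {s: importance[s] + 3.0 for s in scale_means}   # shift to 0..6
    total = sum(weights.values())
    return sum(scale_means[s] * weights[s] / total for s in scale_means)

example_means = {"efficiency": 1.7, "trust": 1.6, "clarity": 1.8}        # hypothetical
example_importance = {"efficiency": 1.0, "trust": 1.2, "clarity": 0.9}   # hypothetical
print(round(ux_kpi(example_means, example_importance), 2))
```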
Therefore, in this paper, we have focused on
the trust, quality, and efficiency of the offered e-
government services. Trust in e-government has been
considered a significant direct determinant of adop-
tion behavior towards the offered e-government ser-
vices (Verkijika and De Wet, 2018; Chohan and
Hu, 2020). Therefore, governments need to make
e-government services more trustworthy to increase
the level of adoption behavior. Moreover, efficient and high-quality services greatly enhance the UX of the offered services. Hence, we
have chosen five UX scales to evaluate the UX of e-
government forms:
1. Efficiency: refers to the users' subjective impression that they can achieve the goals of using the offered services with minimal effort. The service responds quickly to users' actions.
2. Trust: refers to the users' impression that the data they enter into the offered service is secure, safe, and not misused to harm them.
3. Trustworthiness of Content: refers to the users' impression that the information provided by the service is reliable and of good quality. The user trusts the information provided by the offered service.
4. Quality of Content: refers to the users' impression that the information provided by the offered service is high-quality, accurate, well-prepared, and easy to understand. Users are interested in reading the information provided by that service.
5. Clarity: refers to the user's impression of the design, structure, and visual complexity of the graphical user interface of the offered service (Schrepp and Thomaschewski, 2020).
3 RESEARCH METHODOLOGY
3.1 Research Design
In this study, we have designed a web prototype of
an e-government form to understand users’ percep-
tions and evaluate their experience while using this e-
government form. The form asks citizens about their recent activities with e-government services, their ratings of these services, and any suggestions or complaints they have for improving the current e-government services. e-Government in the US serves as the case study
in this paper. The United States Web Design Sys-
tem (USWDS) is the federal government design sys-
tem created by the US government in 2015 (US Gov.,
2022). It is a library of guidelines, code, and tools that helps government digital teams share design solutions for e-government portals and follow effective user-centered design practices. The library is used by dozens of US agencies across nearly 200 portals. Therefore, we have incorporated
the USWDS guidelines as the reference design com-
ponents for our e-government form prototype. We
have successfully implemented the web prototype,
featuring a 4-page e-government form.
This study employed a quantitative approach, uti-
lizing a web-based questionnaire instrument to gather
data. Participants were encouraged to explore the
web prototype form and provide evaluations regard-
ing their experience with the form. Web-based sur-
veys are widely utilized due to their efficiency, cost-
effectiveness, and accuracy in collecting informa-
tion (Nayak and Narayan, 2019). As mentioned
in subsection 2.2, the questionnaire was designed
based on the UEQ+ (User Experience Questionnaire
Plus) framework, incorporating the five scales men-
tioned above to collect empirical data on the user ex-
perience of e-government forms in the US. Each scale
consisted of four measurement items that were incor-
porated to evaluate the specific aspects covered by the
scale and enable participants to express their percep-
tions related to that scale.
3.2 Research Setting
The target population for this research study consists
of individuals who are citizens or legal residents of the
United States and are above 18 years old. To partic-
ipate in the study, individuals were required to confirm
Table 1: Demographic data.

Demographics       Category               Total   Frequency
Sample size                                200
Gender             Male                    106     53%
                   Female                   94     47%
Age                18–30 years old          58     29%
                   31–40 years old          91     46%
                   41–50 years old          33     17%
                   51–60 years old          13      7%
                   Above 60 years old        5      3%
Education level    No formal school          3      2%
                   High school or less      37     19%
                   Associate degree         42     21%
                   Undergraduate degree     56     28%
                   Graduate degree          62     31%
their nationality and age. Participants were selected through random sampling. The total sample consisted of 200 participants
who were U.S. citizens or legal residents. The par-
ticipants were recruited using the services of Ama-
zon Mechanical Turk (MTurk), a reliable crowdsourc-
ing platform utilized by researchers to hire workers
who meet specific study criteria (Amazon, n.d.). As an
incentive for their participation, each participant re-
ceived a payment of $1 USD upon completion of the
assigned task and questionnaire.
Data collection for this study was carried out
using QuestionPro, an online survey software plat-
form (QuestionPro, 2023). To ensure confidentiality,
the data collection process was designed to exclude
participants’ names or any identifying information.
Participants were given approximately 10–15 minutes
to explore the e-government form and then evaluate
their experience based on the five UX scales utilized
in this study. Participation in the study was entirely
voluntary, and respondents had the option to quit the
survey at any point without saving their responses. It
is important to acknowledge the potential influence of
social desirability bias, which may lead participants
to provide less sincere answers if they feel uncom-
fortable disclosing unfavorable opinions. To mitigate
this effect, the survey included a confidentiality clause
assuring participants that all their answers would re-
main completely confidential. Ultimately, a total of
200 valid surveys constituted the final sample size for
this study.
4 RESULTS
4.1 Demographic Data
The demographic distributions of the survey partici-
pants are presented in Table 1. In terms of gender,
participation was nearly balanced, with 53% males and 47% females. Regarding age, most participants were below 50 years old: around 29% were 18–30 years old, almost half were 31–40 years old (46%), and 17% were 41–50 years old. The 51–60 age group accounted for only 7% of participants, and only 3% were above 60 years old. The most common educational level was a graduate degree (31%), followed by an undergraduate degree (28%) and an associate degree (21%). Around 19% of participants had a high school degree or less, while only 2% had no formal schooling.
4.2 User Experience Analysis
To evaluate the User Experience Questionnaire (UEQ) data, we utilized the UEQ data analysis tools provided by Hinderks et al. (2018). These tools were devel-
oped in the form of a Microsoft Excel file, streamlin-
ing the necessary calculations for the analysis. The
tools not only calculate scale values but also generate
relevant bar charts based on the entered data, visually
presenting the results.
Table 2: Mean and confidence interval for each measurement item.
Additionally, basic statistical
indicators essential for data interpretation are com-
puted by the analysis tool. The advantage of using this
analysis tool is that it simplifies the experimenter’s
work, requiring only the insertion of the collected user
experience data. The analysis tool automatically gen-
erates several figures to visually represent the output
of the entered data.
The analysis results are presented in a ten-column table (Table 2). The first column, labeled “scale”, corresponds to the five selected scales in this study. The “item left” column indicates the leftmost items in the UEQ, such as “slow”, “inefficient”, “insecure”, and “obsolete”. The “item right” column represents the rightmost items in the UEQ, including terms like “fast”, “efficient”, “accurate”, and “interesting”. The fourth, fifth, and
sixth columns of the table display basic statistical re-
sults, such as the mean, variance, and standard devi-
ation of the entered data. The seventh column repre-
sents the number of participants involved in the study,
with a total of 200 participants. The eighth and ninth
columns indicate the confidence level and confidence
interval for each measurement item, providing addi-
tional insights into the analysis.
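For reference, the per-item statistics can be reproduced with a few lines of Python. This is a generic sketch, not the Excel tool itself; it assumes the usual normal-approximation 95% interval (mean plus or minus 1.96 times the standard error) and uses made-up response data.

```python
import math

# Generic sketch: mean, standard deviation, and 95% confidence interval
# for one UEQ item (scores already mapped to -3..+3). Data are made up.
def item_statistics(scores, z=1.96):
    n = len(scores)
    mean = sum(scores) / n
    variance = sum((x - mean) ** 2 for x in scores) / (n - 1)  # sample variance
    sd = math.sqrt(variance)
    ci = z * sd / math.sqrt(n)                                 # half-width of the 95% CI
    return mean, sd, ci

sample_scores = [2, 1, 3, 2, 0, 2, 1, 3, 2, 2]  # hypothetical item ratings
m, sd, ci = item_statistics(sample_scores)
print(f"mean = {m:.2f}, SD = {sd:.2f}, 95% CI = ±{ci:.2f}")
```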
During the analysis of the UEQ+ data, the UEQ+
analysis tool generates results displayed in a mean
values table. The range of values measured in the
UEQ spans from -3 to +3. A rating of -3 corresponds
to a significantly negative or “horribly bad” response, while 0 represents a neutral or average rating, and +3 indicates an extremely positive or “most positive”
response (Hinderks et al., 2018). In the context of
the UEQ table, negative values indicate the leftmost
items, which occur when participants rate the mea-
sured items as 1 to 3. The neutral or average value is
calculated when participants choose to rate the items
as 4. Positive values, on the other hand, indicate the
rightmost items in the UEQ table, which occur when
participants rate the items as 5 to 7.
Table 2 presents the comprehensive results of the
UX assessment using the UEQ+. Among the items,
the highest mean value is observed for the “incomprehensible/comprehensible” item, with a mean of 1.84 (SD = 1.07). This indicates that the majority of participants provided positive ratings, perceiving the form as comprehensible. As the “incomprehensible/comprehensible” item belongs to the quality of content scale, it can be inferred that the tested prototype was well received by the participants in terms of content quality.
On the other hand, the lowest mean value is 1.52 (SD = 1.29) for the “transparent/non-transparent” item. Although this value is still positive, it is the lowest of all items, meaning that transparency received the weakest ratings among the measured aspects of the e-government form. As the “transparent/non-transparent” item falls under the trust scale, it can be concluded that the tested prototype is generally perceived as trustworthy by the participants, given the positive mean value. However, the
lower mean value suggests some trust issues regard-
ing the transparency of the e-government form design,
which should be taken into consideration for further
improvement. These highest and lowest mean values
indirectly provide evidence that the tested prototype
was generally accepted by the participants. They also allow us to identify areas for further improvement in order to enhance the quality of the prototype.
Figure 1: Overall mean values of each selected UEQ scale.
Additionally, increasing the mean values would strengthen
the reliability and robustness of the results for future
reference.
In addition to the detailed results, the UEQ+ data
analysis tool also generates an overview of the se-
lected UEQ+ scales as displayed in Figure 1 and Fig-
ure 2. These figures provide a visual representation
of the mean values and importance ratings for the se-
lected scales. All scale means, measured on the -3 to +3 scale, were positive. Figure 1 displays the mean value of each scale used in this study: efficiency is 1.73, trust is 1.66, trustworthiness of content is 1.74, quality of content is 1.78, and clarity is 1.76. The mean values of all scales were above 1, which indicates a positive result and a good experience with the e-government form. The UEQ+ also provides mean importance ratings, which reflect how important participants consider each scale, as seen in Figure 2. The mean importance ratings for the e-government form indicate that all of the scales are important: efficiency is 0.99, trust is 1.15, trustworthiness of content is 1.05, quality of content is 0.77, and clarity is 0.88.
Based on the results, it is evident that the participants' experiences with the prototype were generally
positive, as indicated by the positive mean values for
all the measurement items. This suggests that the ma-
jority of participants perceived the prototype to be sat-
isfactory in terms of the UX. However, it is impor-
tant to note the differences in the mean values across
the various items and scales. The presence of lower
mean values for certain items or scales indicates areas
where the prototype could be further improved to pro-
vide a more engaging and satisfying UX. By focusing
on these specific areas, future iterations of the proto-
type can be tailored to address the shortcomings and
enhance the overall UX.
Figure 2: Importance rating of each selected UEQ scale.
5 DISCUSSION AND
CONCLUSION
The quality of government service delivery relies on
the effectiveness of forms, which serve as crucial in-
terfaces for information exchange between the gov-
ernment and citizens. However, forms often pose
challenges due to their complexity. To address this,
governments must ensure that communication chan-
nels with citizens are user-friendly and easy to un-
derstand. To gain insights into citizens’ experiences
with e-government forms, we developed a web proto-
type and conducted a user experience evaluation. The
evaluation was based on the principles of the US Web
Design System (USWDS), which provides guidelines
for designing online government services and user in-
terfaces. Quantitative analysis was employed, utiliz-
ing a web-based questionnaire that incorporated the
User Experience Questionnaire (UEQ+) with five key
UX scales: efficiency, trust, the trustworthiness of
content, quality of content, and clarity. Participants,
who were US citizens or legal residents, were asked
to evaluate the e-government form prototype based on
these scales. Additionally, participants provided rat-
ings on the importance of each scale for the given
form. A total of 200 participants were recruited to
contribute their evaluations, allowing for a compre-
hensive assessment of the e-government form’s user
experience based on the selected UEQ+ scales.
The results indicate that the UX of the designed e-
government form is generally positive across all five
adopted scales. However, it is noteworthy that the
scale of trust has the lowest mean value (1.66) com-
pared to the other scales (Figure 1). This suggests
that there is room for improvement in enhancing the
trustworthiness of the e-government form. Interest-
ingly, the scale of trust also has the highest mean im-
portance value (1.15) among all the scales, as shown
in Figure 2. This highlights the significance partic-
ipants place on trust in the context of e-government
services. It reinforces the notion that trust is a criti-
cal factor for the success of e-government initiatives.
While the feedback from participants regarding the trust scale is positive, given its high importance rating one would expect this scale to have a higher mean value than the other scales. This discrepancy suggests that there may
be specific areas or aspects within the e-government
form where trustworthiness needs to be further em-
phasized and strengthened. By addressing any short-
comings and focusing on enhancing trust in the e-
government forms, it is possible to elevate the overall
UX and promote greater confidence among users.
Trust in e-government services is a critical factor
that significantly influences the adoption and success
of such initiatives. It refers to citizens’ belief in the
reliability and security of the offered e-government
services (Lallmahomed et al., 2017). The impor-
tance of trust in e-government has been widely rec-
ognized worldwide, as citizens are concerned about
the use and security of their personal information.
Numerous studies have emphasized the significance
of trust in driving citizens’ adoption behavior toward
e-government services. Researchers have found that
when users trust the government agency to safeguard
their data, it increases their likelihood of adopting and
using e-government services (Alharbi, 2015; Chohan
and Hu, 2020). Trust has been identified as a direct
determinant of adoption behavior in various studies.
For instance, Chatzoglou et al. (2015) developed a
framework to identify critical factors influencing the
adoption of e-government services. They highlighted
that trust in the government’s ability to secure citi-
zens’ data plays a crucial role in promoting adoption.
Similarly, Lee et al. (2019) investigated the factors
influencing citizens’ intention to use e-government
services and found that trust in e-government is a sig-
nificant determinant of adoption behavior.
To enhance citizens’ trust and promote adoption,
it is essential for e-government agencies to implement
secure systems and ensure accurate and trustworthy
interactions with users. The graphical user interface
(GUI) of e-government services plays a crucial role
in establishing citizens’ trust. The design and pre-
sentation of information within these services have a
significant impact on the level of trust users place in
the services and ultimately enhance their overall ex-
perience. By utilizing the UEQ+ questionnaire, it be-
comes possible to gain insights into users’ initial ex-
periences with the designed interface of e-government
forms. The findings, particularly in relation to the
trust scale, raise important considerations for future
research. Specifically, it highlights the need for fur-
ther investigation into the trustworthiness of the de-
sign interface of e-government services in the US,
particularly when applying the USWDS guidelines.
Future studies should delve deeper into the impact
of the USWDS guidelines on the trust factor and ex-
plore how adherence to these guidelines influences
users’ trust in e-government services. By conduct-
ing more extensive investigations, researchers can un-
cover valuable insights that will inform the develop-
ment of trustworthy and user-centric e-government
interfaces, ultimately enhancing citizens’ experiences
and fostering greater trust in the services provided.
REFERENCES
Al-Besher, A. and Kumar, K. (2022). Use of artificial intel-
ligence to enhance e-government services. Measure-
ment: Sensors, 24(100484):1–5.
Aldrees, A. and Gračanin, D. (2021a). Cultural Usability
of E-Government Portals: A Comparative Analysis of
Job Seeking Web Portals Between Saudi Arabia and
the United States. In Soares, M. M., Rosenzweig, E.,
and Marcus, A., editors, Proceedings of the 10th Inter-
national Conference on Design, User Experience and
Usability - DUXU’21, pages 3–17, Cham. Springer In-
ternational Publishing.
Aldrees, A. and Gračanin, D. (2021b). Gender Disparity in
the Usability of E-government Portals: A Case Study
of the Saudi Job Seeking Web Portal. In Proceedings
of the 8th International Annual Conference on Elec-
tronic Governance and Open Society - EGOSE’21,
pages 276–290, St. Petersburg, Russia. Springer.
Alharbi, A. (2015). The Influence of Trust and subjec-
tive Norms on Citizens’ Intentions to Engage in E-
participation on E-government Websites. In Proceed-
ings of the Australasian Conference on Information
Systems - ACIS’15, volume 113, pages 1–12, Ade-
laide, Australia. AIS eLibrary.
Amazon (n.d.). Amazon Mechanical Turk.
Chatzoglou, P., Chatzoudes, D., and Symeonidis, S. (2015).
Factors affecting the intention to use e-Government
services. In Proceedings of the Federated Conference
on Computer Science and Information Systems - Fed-
CSIS’15, volume 5, pages 1489–1498, Lodz, Poland.
IEEE.
Chohan, S. R. and Hu, G. (2020). Success Factors Influenc-
ing Citizens’ Adoption of IoT Service Orchestration
for Public Value Creation in Smart Government. IEEE
Access, 8:208427–208448. Conference Name: IEEE
Access.
Hinderks, A., Schrepp, M., Domínguez Mayo, F. J.,
Escalona, M. J., and Thomaschewski, J. (2019). De-
veloping a UX KPI based on the user experience ques-
tionnaire. Computer Standards & Interfaces, 65:38–
44.
Hinderks, A., Schrepp, M., and Thomaschewski, J. (2018).
User Experience Questionnaire.
ISO 9241-210:2019 (2019). Ergonomics of human-system
interaction — Part 210: Human-centred design for in-
teractive systems.
Kamaruddin, K. A. and MdNoor, N. (2017). Citizen-
centric Demand Model for Transformational Govern-
ment Systems. In Proceedings of the 21st Pacific
Asia Conference on Information Systems - PACIS’ 17,
page 14, Langkawi, Malaysia. Association for Com-
puting Machinery.
Lallmahomed, M. Z., Lallmahomed, N., and Lallmahomed,
G. M. (2017). Factors influencing the adoption of e-
Government services in Mauritius. Telematics and In-
formatics, 34(4):57–72.
Laugwitz, B., Held, T., and Schrepp, M. (2008). Construc-
tion and Evaluation of a User Experience Question-
naire. In Proceedings of the 4th Symposium of the
Workgroup Human-Computer Interaction and Usabil-
ity Engineering of the Austrian Computer Society -
USAB’08, pages 63–76, Graz, Austria. SpringerLink.
Lee, T. D., Park, H., and Lee, J. (2019). Collaborative ac-
countability for sustainable public health: A Korean
perspective on the effective use of ICT-based health
risk communication. Government Information Quar-
terly, 36(2):226–236.
Madariaga, L., Nussbaum, M., Marañón, F., Alarcón, C.,
and Naranjo, M. A. (2019). User experience of gov-
ernment documents: A framework for informing de-
sign decisions. Government Information Quarterly,
36(2):179–195.
Nayak, M. S. D. P. and Narayan, K. (2019). Strengths and
weaknesses of online surveys. IOSR Journal of Hu-
manities and Social Sciences, 24(5):31–38.
QuestionPro (2023). QuestionPro Online Survey Software.
Scholta, H., Balta, D., Räckers, M., Becker, J., and Krcmar,
H. (2020). Standardization of Forms in Governments:
A Meta-Model for a Reference Form Modeling Lan-
guage. Business & Information Systems Engineering,
62(6):535–560.
Scholta, H., Mertens, W., Kowalkiewicz, M., and Becker,
J. (2019). From one-stop shop to no-stop shop: An
e-government stage model. Government Information
Quarterly, 36(1):11–26.
Schrepp, M. and Thomaschewski, J. (2019). A Modular
Extension of the User Experience Questionnaire.
Schrepp, M. and Thomaschewski, J. (2020). Handbook for
the modular extension of the User Experience Ques-
tionnaire. Technical Report 2, Mensch & Computer.
Shcherbyna, V. S., Rieznikova, V. V., Radzyviliuk, V. V.,
Bevz, S. I., and Kravets, I. M. (2021). Problems of
concluding business contracts in electronic form. Lin-
guistics and Culture Review, 5(S2):751–763.
Sukmasetya, P., Santoso, H. B., and Sensuse, D. I. (2018).
Current E-Government Public Service on User Expe-
rience Perspective in Indonesia. In Proceedings of the
International Conference on Information Technology
Systems and Innovation - ICITSI’18, pages 159–164,
Bandung - Padang, Indonesia. IEEE.
Umbach, G. and Tkalec, I. (2022). Evaluating e-governance
through e-government: Practices and challenges of
assessing the digitalisation of public governmen-
tal services. Evaluation and Program Planning,
93(102118).
US Gov. (2022). U.S. Web Design System (USWDS).
Veeramootoo, N., Nunkoo, R., and Dwivedi, Y. K. (2018).
What determines success of an e-government service?
Validation of an integrative model of e-filing con-
tinuance usage. Government Information Quarterly,
35(2):161–174.
Verkijika, S. F. and De Wet, L. (2018). E-government adop-
tion in sub-Saharan Africa. Electronic Commerce Re-
search and Applications, 30:83–93.
Yu, J. Y., Goldberg, T., Lao, N., Feldman, B. M., and Goh, Y. I. (2021). Electronic forms for patient reported outcome measures (PROMs) are an effective, time-efficient, and cost-minimizing alternative to paper
forms. Pediatric Rheumatology, 19(67):1–9.