Improving Clarity and Completeness in User Stories:
Insights from a Multi-Domain Analysis with Developer Feedback
Maria Regina Araújo Souza (https://orcid.org/0000-0001-9705-3894) and Tayana Conte (https://orcid.org/0000-0001-6436-3773)
Federal University of Amazonas, Manaus, Brazil
{maria.souza, tayana}@icomp.ufam.edu.br
Keywords: Software Requirements, User Stories, Agile Methodologies, Requirements Analysis.
Abstract:
The clarity and completeness of requirements are crucial in agile software development, where user stories are
widely used to capture user needs. However, poorly written user stories can introduce ambiguities, leading to
inefficiencies in the development process. This paper presents a detailed analysis of 30 user stories from five
different domains, along with feedback from 50 developers gathered through a questionnaire. The analysis,
based on the INVEST criteria (Independent, Negotiable, Valuable, Estimable, Small, and Testable), identified
common issues such as vague acceptance criteria, insufficient technical details, and overly broad stories. Based
on these findings, we present targeted recommendations for improving user story quality, including refining
acceptance criteria, breaking down large stories into smaller components, and incorporating adequate technical
details. The feedback from developers reinforced the value of these practices, highlighting the importance of
collaboration in refining user stories. This study offers actionable insights and practical strategies to enhance
user story quality and promote continuous improvement in agile software development.
1 INTRODUCTION
In agile software development, user stories are widely
used to capture requirements concisely and facilitate
communication between development teams, stake-
holders, and clients (Cohn, 2004; Leffingwell, 2010).
They help align software features with business objec-
tives and user needs (Williams and Cockburn, 2003).
However, despite their advantages, many user sto-
ries lack clarity, well-defined acceptance criteria, and
technical details, leading to inefficiencies in agile
projects (Inayat et al., 2015; Lucassen et al., 2016).
Ambiguous user stories contribute to miscom-
munication, rework, and project delays (Lucassen
et al., 2015; Heck and Zaidman, 2018). Poorly struc-
tured stories hinder development efficiency and com-
promise stakeholder satisfaction (Leffingwell, 2018).
Given their importance in agile methodologies, im-
proving the clarity and completeness of user stories
is essential to reducing development challenges and
enhancing project outcomes.
This paper presents an in-depth analysis of
30 user stories from five distinct domains—E-
commerce, Billing, Security, Collaboration Tools,
and E-learning. Using the INVEST criteria (Inde-
pendent, Negotiable, Valuable, Estimable, Small, and
Testable), we identify common deficiencies and pro-
pose actionable improvements. These include re-
fining acceptance criteria, ensuring sufficient techni-
cal details, and breaking down complex stories into
smaller, more manageable tasks.
Additionally, to validate these findings, we con-
ducted a developer survey with 50 participants, gath-
ering insights into real-world challenges when work-
ing with user stories. The objective of this research
is to provide practical recommendations that support
developers in writing clearer, more detailed, and ac-
tionable user stories, contributing to improved agile
development practices.
2 RELATED WORK
Several studies have analyzed the quality of user sto-
ries and their impact on agile development. Wake
(2003) introduced the INVEST criteria, which serve
as a guideline for writing effective user stories by en-
suring they are actionable, valuable, and testable. De-
spite these guidelines, research indicates that user sto-
ries frequently fail to meet these standards in practice.
Souza, M. R. A. and Conte, T.
Improving Clarity and Completeness in User Stories: Insights from a Multi-Domain Analysis with Developer Feedback.
DOI: 10.5220/0013363500003929
Paper published under CC license (CC BY-NC-ND 4.0)
In Proceedings of the 27th International Conference on Enterprise Information Systems (ICEIS 2025) - Volume 2, pages 272-279
ISBN: 978-989-758-749-8; ISSN: 2184-4992
Proceedings Copyright © 2025 by SCITEPRESS Science and Technology Publications, Lda.
Lucassen et al. (2016, 2015) identify recurring is-
sues such as vague acceptance criteria, lack of techni-
cal details, and oversized stories. These deficiencies
result in ambiguities and inefficiencies during devel-
opment. To address these challenges, Lucassen et al.
(2016) proposed the Quality User Story Framework
(QUS), which aims to improve clarity and consis-
tency. Empirical studies suggest that applying QUS
enhances alignment between stakeholder expectations
and development outcomes.
Inayat et al. (2015) conducted a systematic review
of agile requirements engineering practices and found
that unclear user stories and insufficient details fre-
quently lead to project delays and misaligned expec-
tations. Similarly, Heck and Zaidman (2018) high-
light the importance of testability and consistency in
reducing ambiguities and improving team collabora-
tion.
From a broader perspective, Leffingwell (2010)
discusses the role of user stories in large-scale agile
projects, advocating for structured approaches such
as the Scaled Agile Framework (SAFe). Meanwhile,
Williams and Cockburn (2003) emphasize the impor-
tance of continuous feedback in refining user stories
to align with evolving user needs.
This study builds upon these works by not only
identifying recurring issues in user stories but also
validating them through direct developer feedback.
Our approach provides actionable recommendations
based on real-world challenges, offering practical in-
sights to enhance user story quality in agile software
development.
3 RESEARCH METHODOLOGY
This study employed a mixed-methods approach
combining qualitative and quantitative methods to
evaluate user story quality in agile development. The
research was conducted in three phases:
Data Collection: User stories were gathered from
publicly available repositories across five domains:
E-commerce, Billing, Security, Collaboration Tools,
and E-learning. The selection ensured diversity
in complexity, application contexts, and functional
scope.
User Story Evaluation: Stories were assessed us-
ing the INVEST criteria (Independent, Negotiable,
Valuable, Estimable, Small, and Testable), identify-
ing common issues such as unclear acceptance crite-
ria, lack of technical details, and overly broad scopes.
Developer Feedback Validation: A developer questionnaire was designed to collect insights on real-world challenges related to user stories, including clarity, technical specifications, and acceptance criteria. The questionnaire was shared on LinkedIn and in the private corporate networks of software development companies (particularly those focused on web development), and it was also distributed through professional connections, leveraging a network of developers actively working in agile environments. The feedback gathered helped refine the recommendations for improving user story quality.
Figure 1 illustrates this methodology, emphasiz-
ing the interconnection of these phases and the bal-
ance between theoretical analysis and practical devel-
oper insights.
Figure 1: Methodology Overview.
3.1 User Story Analysis
We conducted an extensive analysis of 30 user stories
from publicly available repositories, covering various
domains such as e-commerce, billing, security, col-
laboration tools, and e-learning. The complete dataset
source is available in an external document: Support-
ing Documentation for User Story Analysis.
Each user story was evaluated not only using the
INVEST criteria, but also through a more developer-
centric perspective. While the INVEST framework
provides a structured guideline for user story quality,
our analysis revealed that even stories that adhered to
these principles still lacked key elements from a de-
veloper’s standpoint.
To capture these real-world challenges, we incor-
porated the perspective of John Smith, a full-stack
developer persona. This approach highlighted addi-
tional critical aspects often overlooked in standard
user story evaluations, such as missing technical de-
tails, unclear problem descriptions, and insufficient
actionable value. While these stories were gener-
ally aligned with INVEST principles, the developer’s
perspective underscored gaps that could lead to inef-
ficiencies, misunderstandings, and additional imple-
mentation effort.
3.2 Developer Feedback Questionnaire
To validate the findings from the user story analysis,
we designed a questionnaire targeting 50 developers
across various domains. The goal was to identify real-
world challenges in user story usage and gather in-
sights for improvement.
The questionnaire was divided into multiple sec-
tions, each focusing on key aspects of user story qual-
ity:
Table 1: Key Aspects of User Story Quality Assessed.

Clarity: Assesses whether user stories avoid vague descriptions or ambiguous goals.
Acceptance Criteria: Evaluates if acceptance criteria are explicitly defined and sufficient for validation.
Technical Details: Checks whether stories provide necessary technical details, such as API references and data formats.
Improvements: Gathers developer feedback on enhancing clarity, technical completeness, and usability.
The questionnaire was distributed via private com-
pany networks, LinkedIn agile communities, and re-
ferrals from professionals in software development.
The complete questionnaire is available at: Support-
ing Documentation for User Story Analysis.
4 RESULTS AND DISCUSSION
4.1 User Story Analysis
To ensure a comprehensive analysis, the 30 collected
user stories were organized into five domains: E-
commerce, Billing, Security, Collaboration Tools,
and E-learning. This categorization facilitated the
identification of domain-specific challenges, particu-
larly those related to clarity, technical details, and ac-
ceptance criteria.
The primary analysis was conducted from the per-
spective of the persona John Smith, a full-stack de-
veloper with five years of experience. This approach
provided practical insights into how developers inter-
pret and implement user stories, revealing real-world
challenges such as ambiguous descriptions, missing
technical details, and unclear acceptance criteria.
To complement this analysis, the INVEST crite-
ria were used as a secondary framework to system-
atically assess user story structure and completeness.
While many stories aligned with INVEST principles,
gaps remained in technical details and clarity from
a developer’s standpoint. The full analysis, includ-
ing detailed evaluations and persona documentation,
is available at: Supporting Documentation for User
Story Analysis.
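The per-story INVEST assessment can be pictured as six boolean checks per story. The following Python sketch is our own illustration, not the instrument used in the study; the flags shown for the Billing story (User Story 2.1) are an example of how a story that bundles several actions fails the Estimable, Small, and Testable checks:

```python
from dataclasses import dataclass, fields

# Illustrative sketch (not the study's actual instrument): one INVEST
# assessment recorded as six boolean flags for a single user story.
@dataclass
class InvestAssessment:
    independent: bool
    negotiable: bool
    valuable: bool
    estimable: bool
    small: bool
    testable: bool

    def failed_criteria(self):
        """Names of the INVEST criteria the story does not meet."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

# Billing User Story 2.1 ("create, update, and delete a subscription"):
# it bundles several actions, so it is neither small nor easily
# estimable, and its missing validation criteria make it hard to test.
story_2_1 = InvestAssessment(
    independent=True, negotiable=True, valuable=True,
    estimable=False, small=False, testable=False,
)
print(story_2_1.failed_criteria())  # ['estimable', 'small', 'testable']
```

Recording assessments this way makes it easy to aggregate failure counts per criterion across the whole story set.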
A summary of the key findings from the user story
analysis is presented in Table 2, highlighting the most
relevant challenges identified within each domain.
Table 2: Summary of User Stories Analysis Across Domains.

E-commerce, User Story 1.1 ("As a visitor, I want to view a selection of shirts to explore my options."): Lacked clear filtering criteria and UI specifications. Needed structured acceptance criteria.

Billing, User Story 2.1 ("As a user, I want to create, update, and delete a subscription."): Combined multiple actions, making estimation difficult. Required breakdown into smaller tasks.

Security, User Story 3.1 ("As a security engineer, I want to verify the system configuration to ensure secure settings."): Lacked security benchmarks and validation tools. Needed supplementary documentation.

E-learning, User Story 4.1 ("As a student, I want to practice grammar exercises to improve my language skills."): Lacked content format specification (multiple-choice vs. fill-in-the-blank). Required clearer acceptance criteria.
4.1.1 Domain-Specific Challenges and
Observations
Across the five domains analyzed, common challenges were identified that make it difficult for developers like John Smith to implement the stories effectively. Table 3 summarizes the key findings and recommendations from the analysis.
Table 3: Domain-Specific Challenges and Suggested Improvements.

Lack of Clear Acceptance Criteria: Define explicit, testable criteria to ensure alignment with development expectations.
Overly Complex User Stories: Break down large stories into smaller, independent tasks for better estimability and clarity.
Insufficient Technical Details: Include essential specifications such as API references, security requirements, and UI/UX guidelines.
Broad and Ambiguous Functionalities: Ensure stories focus on a single functionality to prevent excessive scope and implementation issues.
4.2 Discussion of Key Findings
From the detailed analysis of the user stories across
these domains, several recurring patterns were ob-
served:
1. Clear Acceptance Criteria: Many stories lacked
well-defined acceptance criteria, making it diffi-
cult for developers to determine when the story
was complete. This issue was particularly evident
in User Story 2.1, where no explicit validation
criteria were provided for subscription manage-
ment actions. Providing detailed acceptance cri-
teria would reduce ambiguity and streamline both
development and testing processes.
2. Breaking Down Complex Stories: A significant
number of stories were overly complex, attempt-
ing to cover multiple functionalities within a sin-
gle story. For example, in the Billing domain, a
single story combined actions such as creating,
updating, and deleting subscriptions, making es-
timation difficult. Breaking these stories into in-
dependent components would improve clarity and
estimability.
3. Inclusion of Technical and UI/UX Details: Many user stories lacked the necessary technical details or UI/UX specifications. For example, User Story 4.1 in the E-learning domain, about grammar exercises, did not specify whether the exercises should be multiple-choice or fill-in-the-blank questions. Including such details in user stories would provide clearer guidance for developers and ensure alignment with user expectations.
4. Alignment with INVEST Criteria: While many user stories adhered to aspects of the INVEST criteria, this alone did not ensure they were fully prepared for development. Stories often met the structural requirements but still lacked critical elements, such as the technical or contextual information needed for implementation. This highlights that while INVEST serves as a useful guideline, additional refinement is necessary to make user stories truly actionable for developers, reducing ambiguity and facilitating a smoother development process.
4.3 Recommendations for Improvement
Based on these findings, the following recommenda-
tions can help improve user story quality in agile de-
velopment:
Develop Clear and Testable Acceptance Cri-
teria: Each user story should define explicit,
testable conditions for completion. This would re-
duce ambiguity and ensure that developers know
exactly what is expected.
Break Down Large Stories into Smaller, Inde-
pendent Tasks: By dividing complex stories into
smaller, manageable units, teams can improve the
clarity and estimability of stories, reducing cogni-
tive load during development sprints.
Include Technical and UI/UX Specifications:
Where relevant, user stories should include or
reference technical documentation, such as API
specifications or database schemas, and provide
UI/UX mockups to ensure consistency in imple-
mentation.
These recommendations align with previous re-
search on the importance of clear requirements and
practical user story frameworks in improving the ef-
ficiency and effectiveness of agile development pro-
cesses.
4.4 Developer Feedback
The developer questionnaire provided key insights
into the common challenges faced when working with
user stories. The results indicate that 96.4% of devel-
opers have encountered issues with clarity or speci-
ficity in user stories, often leading to misunderstand-
ings and rework. Additionally, 67.8% of respon-
dents reported that user stories frequently lack essen-
tial technical details, such as data formats, API refer-
ences, or tool-specific guidelines.
A critical issue identified was the absence of well-
defined acceptance criteria, with 85.7% of developers
strongly agreeing on their importance for determining
when a story is complete. Developers also highlighted
vague acceptance criteria (71.4%) and lack of clarity
(69.6%) as the most significant challenges, followed
by dependencies between stories (58.9%) and insuffi-
cient technical details (51.8%).
Nearly 95% of respondents agreed that including practical examples would improve user story comprehension and implementation, helping to clarify expected behaviors and avoid misinterpretation. These insights, along with a detailed breakdown of developer responses, are summarized in Table 4.
Table 4: Summary of Developer Feedback on User Stories.

Have you ever faced difficulties with the clarity or specificity of user stories? Yes: 96.4%; No: 3.6%.

Do user stories generally include sufficient technical details? Strongly agree: 1.8%; Agree: 30.4%; Disagree: 60.7%; Strongly disagree: 7.1%.

Are clear acceptance criteria essential for effective user stories? Strongly agree: 85.7%; Agree: 14.3%.

What are the biggest challenges when working with user stories? Vague acceptance criteria: 71.4%; Lack of clarity: 69.6%; Dependencies between stories: 58.9%; Lack of technical details: 51.8%; No user value: 1.8%; Breaking down large stories: 1.8%.

Would practical examples improve the understanding of user stories? Strongly agree: 50%; Agree: 44.6%; Disagree: 5.4%.
These findings reinforce the need for clearer ac-
ceptance criteria, improved technical details, and the
inclusion of practical examples to enhance the quality
of user stories. The complete questionnaire results,
along with detailed developer responses, are available
in the technical document: Supporting Documenta-
tion for User Story Analysis.
5 CRITICAL ANALYSIS
The findings from the user story analysis and de-
veloper feedback revealed recurring issues in agile
development, including vague descriptions, insuffi-
cient technical details, and unclear acceptance crite-
ria. These challenges often cause delays, miscom-
munication, and rework, limiting the effectiveness of
user stories despite frameworks like INVEST.
To address these issues, a Practical Guide for
Writing Effective User Stories was developed. The
guide highlights three essential practices:
Defining Clear Acceptance Criteria: Ensuring user stories include specific, measurable, and testable criteria for completion.

Providing Technical Details: Including relevant specifications such as API endpoints, data formats, and expected system behavior.

Breaking Down Complex Stories: Dividing large stories into smaller, independent tasks for easier implementation and estimation.

These practices bridge theoretical principles and real-world development needs, enabling teams to improve user story quality and promote collaboration.
Figure 2 summarizes these recommendations,
serving as a reference for teams aiming to enhance
their requirements management process.
Figure 2: Key areas of the Practical Guide for Writing Ef-
fective User Stories.
5.1 Clarity and Objectivity
One of the most prominent issues identified is the lack
of clarity in user stories, as reported by over 90% of
developers surveyed. Vague or ambiguous user sto-
ries lead to misinterpretations, resulting in rework and
delays. For user stories to be effective, they must be
written in a clear and straightforward manner, acces-
sible to all stakeholders, including developers, testers,
and product owners. This is particularly crucial in
complex domains, such as e-commerce and security,
where a lack of clarity can cause significant disrup-
tion.
To ensure clarity, user stories should be written
with simple language and free of unnecessary tech-
nical jargon. Every term used should be well under-
stood by all team members involved, and the goal of
the story should be sharply focused on addressing a
specific user need.
Example of a Clear User Story:
“As a registered user, I want to receive an email notification after placing an order, so that I can track my purchase status.”
This example avoids ambiguity, making the re-
quired functionality easy to understand and imple-
ment, ensuring smooth communication between all
involved parties.
5.2 Breaking Down Large Stories
Overly complex user stories that attempt to cover
multiple actions—such as creation, update, and dele-
tion in one—were reported by developers as diffi-
cult to estimate and test. These complex stories can
overwhelm developers and result in missed deadlines
or incomplete implementations. A recommended
approach is to break down these large stories into
smaller, independent tasks that deliver incremental
value to the user, in line with the INVEST principles.
By ensuring that each smaller story remains es-
timable and testable within a sprint, the workflow
improves, leading to more predictable progress and
fewer bottlenecks in development.
Example of a Simplified Breakdown:
Instead of:
“As an admin, I want to manage users, including creating, updating, and deleting accounts.”
Break it down into:
- “As an admin, I want to create new user accounts to manage system access.”
- “As an admin, I want to update user accounts to reflect new information.”
- “As an admin, I want to delete user accounts when necessary.”
This division allows each functionality to be de-
veloped and tested separately, making the process
smoother and more efficient.
5.3 Defining Clear Acceptance Criteria
Another major issue highlighted by developers was
the lack of well-defined acceptance criteria, with over
90% agreeing that this absence causes confusion dur-
ing development. Without clear criteria, developers
and testers struggle to determine when a user story is
complete, leading to inconsistent results and potential
delays.
Acceptance criteria must be objective, specific,
and testable. They should define clear conditions that
indicate when the functionality has been successfully
implemented.
Example of Acceptance Criteria:
- The system must send a confirmation email upon successful order completion.
- The email must include the order number and purchase details.
- If the payment fails, the user must receive a notification with instructions to retry.
By including precise acceptance criteria like
these, all stakeholders gain a shared understanding of
what constitutes a successful implementation, reduc-
ing the risk of ambiguity or misinterpretation.
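Because criteria like these are objective and testable, they translate almost directly into automated checks. The Python sketch below is a hypothetical illustration: `complete_order` is a toy stand-in for a real order workflow, and each assertion mirrors one of the criteria above:

```python
# Hypothetical sketch: testable acceptance criteria expressed as
# automated checks. complete_order is a stand-in, not a real API.
def complete_order(order_id, payment_ok):
    """Toy order workflow returning the observable outcomes we test."""
    if payment_ok:
        return {"email_sent": True,
                "email_body": f"Order {order_id}: purchase details enclosed",
                "notification": None}
    return {"email_sent": False,
            "notification": "Payment failed. Please retry your payment."}

# Criterion 1: a confirmation email is sent on successful completion.
result = complete_order("A-1001", payment_ok=True)
assert result["email_sent"]

# Criterion 2: the email includes the order number and purchase details.
assert "A-1001" in result["email_body"]
assert "purchase details" in result["email_body"]

# Criterion 3: a failed payment yields a retry notification instead.
failed = complete_order("A-1002", payment_ok=False)
assert not failed["email_sent"] and "retry" in failed["notification"]
print("all acceptance criteria pass")
```

When criteria cannot be written as checks like these, that is usually a sign they are not yet specific or testable enough.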
5.4 Including Technical Details
One of the most consistent findings from the developer feedback was the frequent lack of technical details in user stories. More than two-thirds of developers (67.8%) reported that user stories often did not include enough technical context, such as data formats, API details, or security requirements, resulting in delays and rework.
User stories should strike a balance between fo-
cusing on the user’s needs and providing developers
with the technical details they require for implemen-
tation. Where necessary, stories should reference sup-
plementary technical documents, such as API docu-
mentation or database schemas, to provide clarity on
implementation requirements.
Example:
If a user story involves integrating with an external
API, it should include details such as expected data
formats, HTTP methods, and response codes, or ref-
erence a technical document outlining these aspects.
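One lightweight way to carry such detail is a small, machine-readable contract attached to or referenced by the story. The sketch below is purely illustrative; the `/v1/orders` endpoint, fields, and response codes are hypothetical examples, not a real API:

```python
# Illustrative sketch of the technical annex a story referencing an
# external API might carry: method, data format, and response codes.
# The endpoint and fields here are hypothetical.
api_contract = {
    "endpoint": "/v1/orders",
    "method": "POST",
    "request_format": {"customer_id": "string", "items": "list[{sku, qty}]"},
    "success_codes": [201],
    "error_codes": {400: "validation error", 401: "missing auth token"},
}

def describe(contract):
    """Render the contract as a one-line note a user story could cite."""
    return (f"{contract['method']} {contract['endpoint']} "
            f"-> success {contract['success_codes']}, "
            f"errors {sorted(contract['error_codes'])}")

print(describe(api_contract))
# POST /v1/orders -> success [201], errors [400, 401]
```

Keeping the contract structured (rather than prose) lets teams validate stories automatically for missing fields such as response codes.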
5.5 Using Practical Examples
Nearly 95% of developers agreed that practical examples significantly enhance their understanding of user stories. By providing clear examples that illustrate expected behaviors in different scenarios, developers can better grasp the requirements and avoid ambiguity during implementation.
Example:
“If the user selects express shipping, they should see an estimated delivery time within 24 hours.”
Including practical examples like this provides
concrete reference points for developers and helps
align their work with stakeholder expectations.
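Examples phrased this concretely can even double as executable checks. In the hypothetical Python sketch below, `estimate_delivery` stands in for the real shipping module, and the assertion restates the express-shipping example as a test:

```python
from datetime import timedelta

# Hypothetical sketch: a concrete story example rewritten as a check.
# estimate_delivery is a stand-in for the real shipping module.
def estimate_delivery(shipping_option):
    if shipping_option == "express":
        return timedelta(hours=24)
    return timedelta(days=5)  # assumed default for standard shipping

# "If the user selects express shipping, they should see an estimated
# delivery time within 24 hours."
assert estimate_delivery("express") <= timedelta(hours=24)
print("express-shipping example holds")
```

The example in the story and the check in the codebase then stay aligned by construction.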
5.6 Proposing Practical Solutions
The analysis highlights several key areas where im-
provements in the writing and structuring of user sto-
ries can directly address the challenges identified.
Focusing on the following strategies can help agile
teams develop clearer, more actionable stories:
Prioritize Clarity: Write user stories using
straightforward language, avoiding technical jar-
gon where possible, and clearly defining the
story’s objective.
Break Down Complex Stories: Divide large,
complex stories into smaller, independent units
that can be completed and tested within a sprint.
Define Specific Acceptance Criteria: Ensure ev-
ery user story includes clear, testable criteria that
stakeholders can use to confirm completion.
Include Relevant Technical Details: Provide
or reference all necessary technical information,
such as API documentation, database schemas, or
flowcharts, to ensure developers have the informa-
tion they need.
Use Practical Examples: Incorporate examples
into user stories to clarify how the system should
behave in different scenarios.
By focusing on these improvements, agile teams
can significantly reduce misunderstandings, improve
efficiency, and deliver higher-quality software that
better aligns with stakeholder expectations. These
practical solutions reflect the combined insights of
both the user story analysis and developer feedback,
offering a path toward more effective user stories and
smoother development processes.
6 CONCLUSION AND FUTURE
WORK
This study has underscored the critical importance of
improving user story writing practices in agile soft-
ware development. The findings reveal that, while
frameworks such as INVEST provide solid guide-
lines, their practical implementation often falls short
in areas like clarity, well-defined acceptance criteria,
and sufficient technical details. These shortcomings
lead to confusion and inefficiencies that directly im-
pact the development process.
The research conducted through both the user
story analysis and developer feedback has highlighted
several key areas of improvement:
Clarity and Simplicity: User stories must be
written in clear, accessible language that avoids
ambiguity, ensuring all stakeholders understand
the requirements.
Breaking Down Large Stories: Large, complex
stories should be divided into smaller, manageable
units, each with independent value, to improve es-
timability and facilitate smooth sprint planning.
Clear Acceptance Criteria: Explicit and testable
acceptance criteria should be defined for each
story, providing clear guidelines for developers
and testers to know when the work is considered
complete.
Inclusion of Technical Details: User stories
should include or reference the necessary techni-
cal specifications—such as data formats, API doc-
umentation, and security requirements—to reduce
ambiguity and streamline the implementation pro-
cess.
Use of Practical Examples: Incorporating exam-
ples of expected outcomes, edge cases, and com-
mon scenarios helps clarify functionality and re-
duces miscommunication during implementation.
The suggestions put forth aim to create more ac-
tionable and reliable user stories, ultimately lead-
ing to better alignment with stakeholder expectations
and improved development efficiency. The inclu-
sion of additional supporting documentation, such
as flowcharts, UI/UX prototypes, and technical dia-
grams, was also recommended by developers to fur-
ther enhance story clarity.
For future work, we highlight two key areas for
further exploration. First, validating these recommen-
dations through real-world case studies would provide
empirical evidence of their effectiveness in improving
user story quality. By applying the proposed improve-
ments in live software projects and analyzing their im-
pact on development efficiency and stakeholder satis-
faction, teams can better understand the practical ben-
efits of enhanced user story practices.
Second, an interesting avenue for future research
is leveraging AI tools to assist in refining and ensur-
ing the quality of user stories. AI-driven approaches
could help identify ambiguities, suggest refinements,
and generate acceptance criteria automatically, reduc-
ing cognitive load for development teams and promot-
ing more consistent story writing practices.
The quality of user stories has a direct impact on
the success of agile software development. This study
has demonstrated that clear language, well-defined
acceptance criteria, sufficient technical details, and
active collaboration between teams are essential el-
ements for creating effective user stories. By imple-
menting the practices recommended in this research,
development teams might expect improved efficiency,
fewer misunderstandings, and higher-quality software
deliveries that meet stakeholder expectations.
As teams evolve their practices, user stories will
become an even more powerful tool for managing
requirements and ensuring smooth communication
throughout the development process. Continued stud-
ies and experimentation will further refine the pro-
cess of writing user stories, contributing to stronger
alignment between technical teams and business ob-
jectives.
ACKNOWLEDGEMENTS
Our gratitude goes to the participants of this study,
whose engagement and insights were fundamental
to this research. We would like to thank the fi-
nancial support granted by CNPq (314797/2023-8;
443934/2023-1; 445029/2024-2).
REFERENCES
Cohn, M. (2004). User Stories Applied: For Agile Software
Development. Addison-Wesley Professional.
Heck, P. and Zaidman, A. (2018). A systematic literature
review on quality criteria for agile requirements spec-
ifications. Software Quality Journal, 26(1):127–160.
Inayat, I., Salim, S. S., Marczak, S., Daneva, M., and
Shamshirband, S. (2015). A systematic literature re-
view on agile requirements engineering practices and
challenges. Computers in Human Behavior, 51:915–
929.
Leffingwell, D. (2010). Agile Software Requirements: Lean
Requirements Practices for Teams, Programs, and the
Enterprise. Addison-Wesley Professional.
Leffingwell, D. (2018). SAFe 4.5 Reference Guide: Scaled
Agile Framework for Lean Enterprises. Addison-
Wesley Professional.
Lucassen, G., Dalpiaz, F., van der Werf, J. M. E., and
Brinkkemper, S. (2015). Forging high-quality user
stories: Towards a discipline for agile requirements.
In 2015 IEEE 23rd International Requirements Engi-
neering Conference (RE), pages 126–135. IEEE.
Lucassen, G., Dalpiaz, F., van der Werf, J. M. E., and
Brinkkemper, S. (2016). Improving agile require-
ments: the quality user story framework and tool. Re-
quirements Engineering, 21(3):383–403.
Wake, B. (2003). Invest in good stories, and smart tasks.
Williams, L. and Cockburn, A. (2003). Agile software de-
velopment: it’s about feedback and change. Com-
puter, 36(6):39–43.