Comparing Computerized Physician Order Entry Usability
between Expert and Novice Primary Care Physicians
Martina A. Clarke¹, Jeffery L. Belden² and Min Soon Kim¹,³
¹University of Missouri Informatics Institute, University of Missouri, Columbia, MO, U.S.A.
²Department of Family and Community Medicine, University of Missouri, Columbia, MO, U.S.A.
³Department of Health Management and Informatics, University of Missouri, Columbia, MO, U.S.A.
Keywords: Electronic Health Record, Usability, Primary Care.
Abstract: Objectives: To examine usability gaps between expert and novice primary care physicians when using computerized provider order entry (CPOE). Methods: Usability tests involving video analysis were conducted with ten novice and seven expert physicians, using a triangulated method approach. Results: While most novice physicians completed tasks less proficiently and gave lower System Usability Scale (SUS) scores than expert physicians, the 'percent task success rate' (t(8) = 2.31, p=0.98) did not differ significantly between the two physician groups across all five tasks. Seven common and four unique usability issues were identified between the two physician groups. Three themes emerged during analysis: user interface issues, ambiguous terminologies, and training and education issues. Discussion and Conclusion: This study identified varying usability issues for users of CPOE with different expertise. Two additional iterations of usability data collection are underway to uncover comprehensive usability issues and measure learnability.
1 INTRODUCTION
The use of health information technology (HIT) in
clinical practice is increasing rapidly and more
physicians are using computerized provider order
entry (CPOE) because of the financial incentives
promised by Medicare and Medicaid. CPOE is defined as a clinician's use of computer assistance to prescribe medication orders from an electronic device. There is some evidence that the use of CPOE may cause unintended consequences, such as increases in clinician workload, undesirable workflow issues, and the generation of new kinds of errors (Berger and Kichak, 2004, Ash et al., 2003). Poor usability of CPOEs has been shown to be one of the major factors leading to reduced efficiency, decreased quality of patient care, and frustrated clinicians (Khajouei and Jaspers, 2010, Chan et al., 2011, Kjeldskov et al., 2010, Neinstein and Cucina, 2011). Usability is
defined in this study as how well users can operate a
system to effectively and efficiently achieve
particular goals with satisfaction (1998). With healthcare reform underway, the shortage of primary care providers relative to a growing patient population has greatly reduced physicians' time with patients (2011a). Allowing physicians to quickly complete
required tasks within the CPOE may relieve a part of
the time constraints they experience while treating
patients.
The overall objective of this pilot study is to
compare performances and determine unique and
common usability problems between expert and
novice physicians. A usability gap is defined here as the challenges that users experience when using a system to complete specific tasks. Our hypothesis is that expert physicians will encounter fewer usability issues and be more efficient than novice physicians when using the CPOE. If there is no significant difference between novice and expert physicians, then the usability issues identified are not attributable to novice physicians' inexperience with the system; rather, they indicate room for improvement in the current design of the CPOE.
2 METHOD
2.1 Study Design
To identify usability gaps in CPOE systems between
expert and novice physicians, data collection was conducted through standard usability tests with video analysis (Morae®, TechSmith, Okemos, MI), in which eleven family medicine and four internal medicine first-year resident physicians and one attending physician completed five artificial, scenario-based tasks in a laboratory setting. To examine the usability gaps between the novice and expert physicians, quantitative analyses were conducted that included four sets of performance measures, a System Usability Scale (SUS) measurement, and a subtask analysis. Each usability test lasted about 20 minutes and was conducted on a 15-inch laptop running Windows 7.
To maintain consistency and minimize unwanted distractions, only the participant and the facilitator were present in the room. The session began with the participant
being reminded that their participation in this study
was completely voluntary and they had the right to
stop the session at any time. The participant was then
instructed to read the printed instructions, containing
a scenario and five tasks, from a binder next to the
laptop. The facilitator sat near the participant to be
available for any questions while supervising the
session. Participants completed the tasks on their own
and the facilitator intervened only if technical issues
arose. This pilot study was reviewed and approved by
the University of Missouri Health Sciences
Institutional Review Board.
2.2 Organizational Setting
University of Missouri Health System (UMHS) is a
536-bed tertiary care academic hospital
located in Columbia, Missouri. UMHS employs more
than 70 primary care physicians at UMHS clinics
throughout central Missouri and had an estimated
553,300 clinic visits in 2012. The Department of
Family and Community Medicine (FCM) manages
six clinics and has over 100,000 patient visits at these
clinics, while the Department of Internal Medicine
(IM) manages two clinics (2011b). The Healthcare
Information and Management Systems Society
(HIMSS), which is a non-profit organization that
tracks how hospitals are adopting electronic medical record (EMR) applications, has awarded UMHS Stage 7 of the EMR Adoption Model (2013b, 2011c), which means the hospital uses electronic patient charts, incorporates data warehousing to analyze clinical data, and shares data electronically with authorized health care entities (2011c).
Evaluating the usability of a fully implemented CPOE system within one of the most wired health care settings makes the goal of this study achievable.
2.3 Participants
Currently there is no evidence-based way to measure
EHR experience. Based on discussions with an experienced physician champion (JLB) and two chief residents from both participating departments (FCM, IM), clinical training level and experience with the CPOE were used to differentiate novice physicians from expert physicians. First-year residents were categorized as novice users, and residents with over one year of experience with the current CPOE were considered expert physicians. In the experts' experience, being proficient in one EHR does not make a user proficient in all EHRs. First-year residents were selected as novice physicians because they experience more clinical and technical burdens than any other resident physician group.
The sample of first-year resident physicians (residents) was selected from UMHS FCM and IM because, as primary care residents, their clinical roles and responsibilities are comparable. One team member (JLB), a family medicine physician, had valuable connections with these two provider groups. The convenience sampling method was used when selecting participants (Battaglia, 2008), and data collection ran from November 12, 2013 to December 19, 2013. Based on a review of the literature, ten participants are sufficient in exploratory usability studies to identify salient usability issues (Barnum, 2003, Kim et al., 2012). Residents were compensated for their participation.
2.4 Scenario and Tasks
In this study, the case presented to the residents was
a ‘scheduled follow up visit after a hospitalization for
pneumonia.' Five tasks commonly performed by both expert and novice primary care physicians were conceptualized for the participants to complete. To keep the evaluation practical, the tasks were also part of the EHR training residents received at the start of their residency and did not include complex tasks not covered in training. Each task had a clear objective that physicians could follow without excessive clinical cognitive challenge or ambiguity, as assessing clinical reasoning was not one of the study's goals. The tasks completed were:
Task 1: Place order for chest X-ray
Task 2: Place order for Basic Metabolic Panel (BMP)
Task 3: Change a medication
Task 4: Add a medication to your favorites list
Task 5: Renew one of the existing medications
ComparingComputerizedPhysicianOrderEntryUsabilitybetweenExpertandNovicePrimaryCarePhysicians
305
2.5 Data Analysis
The overall objective of this study was to determine
usability gaps in CPOE systems between expert and
novice physicians. Morae Recorder was used to
capture audio, video, on-screen activity, and inputs
from the keyboard and mouse. Morae Manager was then used to analyze the recorded sessions by calculating performance measures and, with markers, coding difficulties and errors and completing the subtask analysis. Approximately 1.5 hours of video analysis were required per 20-minute recorded session. The first step in the analysis was to review the recorded session and label any tasks that were not marked during data collection. The second step was to subdivide each of the five tasks into smaller tasks in order to calculate the task success rate and identify subtle usability challenges that might otherwise have been missed. The t-test was used to compare performance measures. Pearson's correlation was used to determine whether there is a relationship between SUS scores and performance measures.
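As a concrete illustration of this comparison, the following minimal sketch runs an independent two-sample t-test and a Pearson correlation with SciPy; the group sizes mirror the study (seven experts, ten novices), but every value is a placeholder rather than the study's data.

# A minimal sketch of the statistical comparison described above, using SciPy.
# All numbers are placeholders, not the study's data.
from scipy import stats

# Percent task success rate per participant (hypothetical values).
expert_success = [55, 48, 60, 45, 52, 50, 40]               # seven expert physicians
novice_success = [50, 42, 58, 49, 51, 46, 53, 44, 56, 48]   # ten novice physicians

# Independent two-sample t-test comparing the two physician groups.
t_stat, p_value = stats.ttest_ind(expert_success, novice_success)
print(f"task success: t = {t_stat:.2f}, p = {p_value:.2f}")

# Pearson correlation between task success and SUS score across all participants.
sus_scores = [70, 65, 74, 55, 68, 72, 45, 66, 60, 71, 48, 69, 73, 64, 67, 61, 70]
r, _ = stats.pearsonr(expert_success + novice_success, sus_scores)
print(f"SUS vs. task success: r = {r:.2f}")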
2.6 Sub-task Analysis
Each physician may complete the same task in various ways, which is why subtasks are included in the usability analysis to understand how participants interact with the system at a more granular level. Each video-recorded session was reviewed and individual tasks were broken down into smaller subtasks, which were examined and compared across participants and tasks to identify subtle usability challenges, in the form of errors, workflow, and navigation pattern variability, that otherwise would have been overlooked. For example, when physicians complete Task 1, "Place order for chest X-ray in one month", the desired subtasks would be (a coding sketch follows the list):
1. Go to CPOE
2. Find Chest X-ray
3. Click Done
4. Enter Presenting symptom: type 'cough, pneumonia follow-up'
5. Enter Requested time frame: select '4 weeks'
6. Enter Requested Start Date/Time: use calendar to get to <date one month away>
7. Add Supervising Physician: ‘Belden’
8. Click ‘Sign’
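As a purely illustrative sketch, the desired subtask sequence can be recorded as an ordered checklist and each recorded session marked against it; the labels below mirror the list above, but the data structure is an assumption, not the study's coding scheme.

# Hypothetical coding of the desired subtask sequence for Task 1.
DESIRED_SUBTASKS_TASK1 = [
    "go to CPOE",
    "find chest X-ray",
    "click Done",
    "enter presenting symptom",
    "enter requested time frame",
    "enter requested start date/time",
    "add supervising physician",
    "click Sign",
]

def code_subtasks(observed_steps):
    """Return one True/False flag per desired subtask, in order, for one recorded session."""
    return [step in observed_steps for step in DESIRED_SUBTASKS_TASK1]

# Example: a participant who skipped the start date field and the supervising physician.
flags = code_subtasks({"go to CPOE", "find chest X-ray", "click Done",
                       "enter presenting symptom", "enter requested time frame", "click Sign"})
print(flags)  # [True, True, True, True, True, False, False, True]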
To analyze our data, thematic analysis was utilized to
report our usability findings (Braun and Clarke,
2006). Some themes identified in this study were adopted from a study by Walji et al. (Walji et al., 2013) but were modified to include implications for clinical workflow. Usability issues were recorded, and an example was included to explain where each issue took place. An attending physician and experienced usability expert (JLB) participated in the discussion of implications for clinical practice and workflow and contributed suggestions for improvement.
2.7 Performance Measures
Four performance measures were used to analyze user
performance as follows:
1. Percent task success rate, which was computed
by determining the percentage of subtasks that
participants completed successfully without any
errors.
2. Time-on-task, which measures the duration of time each participant took to complete a given task, beginning when the participant clicks 'start task' and ending when 'end task' is clicked.
3. Mouse clicks, which counts the number of mouse clicks made while completing a given task.
4. Mouse movement, which computes the length in pixels of the navigation path taken to complete a given task.
For time on task, mouse clicks, and mouse movement, lower values signify higher performance; higher values may indicate that the participant had difficulty with the system.
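The sketch below shows how these four measures could be computed from per-task logs; the function names and inputs are illustrative assumptions, not the Morae export format.

# A minimal sketch of the four performance measures; inputs are illustrative assumptions.
import math
from datetime import datetime

def percent_task_success(subtask_flags):
    """Percentage of subtasks completed successfully without errors."""
    return 100.0 * sum(subtask_flags) / len(subtask_flags)

def time_on_task(start, end):
    """Seconds between clicking 'start task' and clicking 'end task'."""
    return (end - start).total_seconds()

def mouse_clicks(click_events):
    """Count of mouse clicks recorded while completing a task."""
    return len(click_events)

def mouse_movement(cursor_samples):
    """Length of the navigation path in pixels, from a list of (x, y) cursor positions."""
    return sum(math.dist(a, b) for a, b in zip(cursor_samples, cursor_samples[1:]))

# Example for one task: 6 of 8 subtasks succeeded, 39 seconds, short mouse path.
print(percent_task_success([True] * 6 + [False] * 2))   # 75.0
print(time_on_task(datetime(2013, 11, 12, 9, 0, 0),
                   datetime(2013, 11, 12, 9, 0, 39)))   # 39.0
print(mouse_movement([(0, 0), (300, 400), (300, 900)])) # 1000.0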
2.8 System Usability Scale
To supplement the performance measures, each
participant completed the System Usability Scale (SUS), a ten-item Likert-scale questionnaire that provides an overall, subjective assessment of a system. SUS yields a single number that represents a composite measure of the overall usability of the system under analysis, on a scale from 0 to 100, with 100 being a perfect score (Brooke, 1996). A score of 0 to 50 is considered not acceptable, 50 to 62 low marginal, 63 to 70 high marginal, and 70 to 100 acceptable (Bangor et al., 2009).
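A minimal sketch of the standard SUS scoring rule (Brooke, 1996) follows; the responses shown are placeholder values, not participant data.

# Standard SUS scoring: odd-numbered (positively worded) items contribute (response - 1),
# even-numbered (negatively worded) items contribute (5 - response); the sum is scaled by 2.5.
def sus_score(responses):
    """Compute a 0-100 SUS score from ten Likert responses (1 = strongly disagree, 5 = strongly agree)."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so even i = odd-numbered item
                for i, r in enumerate(responses))
    return total * 2.5

# Placeholder example: answering 4 on every item, positive and negative alike, yields 50.
print(sus_score([4] * 10))  # 50.0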
3 RESULTS
3.1 Participants
Seven novice physicians were from FCM and three
were from IM at UMHS. The age of novice
physicians ranged from 27 to 31 and the mean age
was 28 years. Four (40%) novice physicians had no
other experience with an EHR other than the EHR at
HEALTHINF2015-InternationalConferenceonHealthInformatics
306
UMHS, two (20%) had less than 3 months of experience, one (10%) had 7 months to one year of experience, and three (30%) had over 2 years of experience with an EHR other than the current CPOE.
Six family medicine expert physicians and one internal medicine expert physician participated in the study. Two expert
physicians did not provide information on their date
of birth and EHR experience and were not included
in the calculation of age range, mean age, and EHR
experience. The age of expert physicians ranged from
30 to 62 and the mean age was 37 years. One (14%)
expert physician had no other experience with an
EHR other than the EHR at UMHS, one (14%) had 7 months to one year of experience, and three (43%) had over 2 years of experience with an EHR other than the current CPOE.
3.2 Performance Measures
Geometric mean values (Cordes, 1993) of percent
task success rates (50%, expert group vs. 50%, novice
group), time on task (39s, expert group vs. 45s, novice
group), mouse clicks (9 clicks, expert group vs. 10
clicks), and mouse movements (8,802 pixels, expert
group vs. 8,146 pixels, novice group) of five tasks
were compared between the expert and novice
physicians across two rounds. There was no
significant difference in percent task success rate (t(8)
= 2.31, p=0.98), time on task t(8) = 2.31, p=0.59),
mouse clicks (t(8) = 2.31, p=0.64), and mouse
movement (t(8) = 2.31, p=0.70) which means we fail
to reject the null hypothesis.
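The group summaries above use the geometric mean, which can be computed as in the sketch below; the times shown are placeholders, not the study's measurements.

# Geometric mean of per-task completion times, as used for the group summaries above.
# Values are placeholders, not the study's measurements.
from statistics import geometric_mean  # Python 3.8+

expert_times = [32, 41, 38, 45, 36]   # seconds across the five tasks (hypothetical)
novice_times = [40, 52, 44, 47, 43]

print(round(geometric_mean(expert_times)))  # 38
print(round(geometric_mean(novice_times)))  # 45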
To determine whether novice physicians' experience with another CPOE affected their performance measures in this study, geometric mean values of novice physicians with over 1 year of previous experience with another EHR were compared with those of novice physicians with less than 1 year of such experience. There was no significant difference between the two groups in task success rate (47% vs. 51%; t(8) = 2.31, p = 0.91), time on task (33 s vs. 52 s; t(8) = 2.31, p = 0.62), mouse clicks (6 clicks vs. 12 clicks; t(8) = 2.31, p = 0.81), or mouse movement (3,445 pixels vs. 11,779 pixels; t(8) = 2.31, p = 0.57), with the first value in each pair for novices with over 1 year of previous experience and the second for novices with less than 1 year.
3.3 System Usability Scale
All ten novice physicians and six out of seven expert
physicians completed the SUS after the usability test.
The SUS demonstrated that novice physicians rated
the system usability at a mean of 68 (high marginal)
and experts rated it at a mean of 70 (acceptable). Two
novice physicians and one expert physician gave a
score below 50 (not acceptable). This result may
indicate that novice and expert users of the CPOE still
might not accept the product regardless of proficiency
or length of time using the system. The Pearson correlation coefficient between task success rate, the most objective performance measure, and participants' individual SUS scores indicates that participants' task success may have almost no relation to how user-friendly they regarded the CPOE system (r=0.09).
3.4 Usability Issues Identified through
Sub Task Analysis
Sub-task analysis was also instrumental in identifying
multiple usability concerns. There were seven
common and four unique usability issues identified
between the two physician groups (Table 1). Three
themes emerged during analysis: user interface
issues, ambiguous terminologies, and training and
education issues. The majority of usability issues, stemming from user interface problems and ambiguous terms, may increase the time both novice and expert physicians spend completing orders instead of spending it with their patients. Training and educational issues
also arose for both novice and expert physicians,
which could be alleviated by improving training on
these specific issues.
4 DISCUSSION
4.1 User Interface Usability Issues
Poor user interface design of CPOEs may cause
usability issues and increase the risk of medical errors
if important information is not presented in an
effective manner (Khajouei and Jaspers, 2010). Poor
interface design may make it difficult for physicians, especially novice physicians, to find certain information, which may lead to unsuccessful searches that further frustrate physicians (Horsky et al., 2004, Zhan et al., 2006). Many adverse drug events, for example, have resulted from poor CPOE interface design rather than from human error (Khajouei and Jaspers, 2010, Koppel et al., 2005, Peute and Jaspers, 2007).
Table 1: Usability issues identified from subtask analysis, their implications on clinical practice/workflow, and suggestions for improvement.

USER INTERFACE ISSUES

Issue: Inconsistent ordering of command/action buttons. The location of the buttons 'Orders for Signature', 'Sign', and 'Done' varies depending on the window being used.
Implication: Orders may not get completed. Users ignore the alert warning that 'some tasks are not complete. Are you sure you want to leave this chart?'
Suggestion: Conduct a card sort / user mental mapping process to see what terms users find more natural.

Issue: Illogical ordering of lists. The medication list cannot be alphabetized when imported into a patient's visit note.
Implication: Non-alphabetized lists frustrate physicians when they cannot figure out how to sort the medication list.
Suggestion: Import the medication list into the visit note in the order that physicians had it sorted in the CPOE.

Issue: Unclear menu options. To change a medication, the user must choose among 'Renew', 'Cancel/DC', and 'Cancel/Reorder'.
Implication: Physicians make the wrong choices and take longer to complete the task because the language is confusing.
Suggestion: Test the language in the menus with actual users in a group session to identify the best terms to use.

Issue: Hiding functionalities one layer down. Physicians cannot add a medication to a favorites list from the medication list in a patient's visit note; adding a medication to favorites can only be done in the order detail view, not in the main medication list view.
Implication: Physicians are less likely to build a favorites menu and therefore cannot take advantage of this functionality.
Suggestion: Allow the option to add a medication favorite by right-clicking the main medication list.

Issue: Extra mouse clicks. To see changes made in the CPOE, the 'Refresh' button must be clicked for the changes to appear.
Implication: Physicians may not notice there is new information and may act without the new piece of information; they become confused and frustrated because they expect the results to update automatically.
Suggestion: The 'Refresh' button improves performance by reducing frequent queries to the database; train users to remember to click 'Refresh'.

AMBIGUOUS TERMINOLOGIES

Issue: Multiple fields with the same functionality. There is no clear difference between the drop-down labeled 'Requested Start Date', the drop-down labeled 'Requested Time Frame', and the radio button labeled 'Future Order'.
Implication: Future labs may not be ordered properly, so labs may not be completed at the right time; patients may have to get the test redone, which brings additional cost to the patient.
Suggestion: Remove fields that may be duplicates.

Issue: Search results do not match users' expectations. A search for BMP, a blood test that provides information about a patient's metabolism (MedlinePlus, 2013), retrieves multiple versions of the same test with different order detail completion.
Implication: It takes extra effort for physicians to complete orders.
Suggestion: Pare down menu options; remove unnecessary options or simplify menu choices.

Issue: Vague wording for alerts. A novice physician tried to order a chest X-ray but continuously received an error: 'Radiology orders should be placed following downtime procedures during 2200 and 0000.'
Implication: The physician becomes frustrated and spends time trying to decipher the meaning of the alert.
Suggestion: Create more meaningful alerts from which the physician can clearly understand the next steps.

Issue: Unexpected terms in date fields. An expert physician did not create a future order for one month because the terminology used was 'four weeks' and the user kept searching for 'one month'.
Implication: It is confusing and takes doctors a little longer to complete orders.
Suggestion: Add an additional choice that says '1 month.'

TRAINING AND EDUCATION

Issue: Extra steps to complete multiple orders. Novice physicians did not know how to create two orders at the same time; one novice physician mentioned that there was probably a way to order both but did not know how.
Implication: Physicians take more steps to complete multiple orders.
Suggestion: Make the new orders being processed more visible to the user.

Issue: Entering the date in the wrong field. An expert physician put the future date in the comments field to place a future lab order instead of using the structured date entry field.
Implication: If the date is not entered properly, the labs may be completed at the wrong time.
Suggestion: Educate physicians on best practices for entering future orders.
HEALTHINF2015-InternationalConferenceonHealthInformatics
308
Inconsistent ordering of command/action buttons may cause a prescribed medication order not to be completed, because the physician may click the wrong button and cancel the order instead of placing it. Physicians have very limited time during clinical encounters. During our study, physicians did not have the option to reorder the medication list alphabetically. They also faced difficulty when changing a medication and spent more time than necessary adding a medication to a favorites list. Usability issues such as illogical ordering of terms and unclear options for specified tasks cut into the limited time physicians have to see patients and may negatively affect clinical workflow. CPOE interface designs that do not integrate with physicians' behavior and decision-making processes may cause inefficient workflow (Khajouei and Jaspers, 2010). A study by Walji et al., evaluating the usability of a dentistry EHR interface, also detected several challenges from poor user interface design similar to those in this study, such as illogical ordering of terms and time-consuming processes to complete simple tasks (Walji et al., 2013).
4.2 Ambiguous Terminology
Ambiguous terminologies can create errors when
physicians are trying to complete a task in the
CPOE. For example, one novice physician was not
able to order a chest X-ray for task 1 and did not
understand the meaning of the alert presented to him. If a physician received this alert during clinical workflow, the physician would not be able to order the chest X-ray in a timely manner and may have to return to the order later, or may forget to re-order the X-ray altogether. The physician may also miss a critical diagnosis that could have been identified had the patient received the X-ray. A study by Yui et al.,
evaluating the satisfaction of physicians with the
CPOE system, also found usability issues where
keyword searches did not produce expected results.
Physicians were not able to locate a common test,
‘urine analysis’, by typing ‘urine’ (Yui et al., 2012).
4.3 Training and Education Issues
A study by Ghahramani et al., evaluating CPOE's impact on workplace stress and job performance, stated that training clinicians in CPOE use should start during medical or nursing school to increase familiarity and to improve patient safety and efficiency (Ghahramani et al.,
2009). In this study, no statistically significant
differences were found between expert and novice
physicians’ performance measures but novice
physicians expressed slightly less satisfaction with
the CPOE than expert physicians, which disproves
our hypothesis. A study done by Kim et al. (Kim et
al., 2012), analyzing usability gaps between expert
and novice emergency department nurses, found
similar results where no substantial difference was
found in task success rate on EHR use between two
nurse groups with varying expertise. The results
from our study may suggest that there was no
increase in learning as experience with CPOE
increased. Lack of appropriate training before use of
CPOE may cause more medication errors and
adverse drug events. Physicians who participated in a study conducted by Yui et al. also attributed their inexperience with the system to a lack of training; senior attending physicians believed that their unfamiliarity with the CPOE system stemmed from a lack of a targeted training program (Yui et al., 2012). Also, a study by Devine et al., evaluating the effect of an ambulatory CPOE on medication errors and adverse drug events (ADEs), found that after implementing the CPOE and training physicians to use it, the frequency of errors declined from 18.2% to 8.2% (Devine et al.,
2010).
4.4 Limitations to This Study
This pilot study was successful in identifying gaps
in usability issues and performance measures
between novice and expert physicians but also
contained several methodological limitations. This
study was limited to primary care, had a small sample size, and evaluated one CPOE at one healthcare institution, which means the results may not be generalizable to other specialties or other healthcare institutions. The usability test was also
conducted using a limited number of clinical tasks
and may not represent other actions taken in
different clinical scenarios. This study was
conducted in a laboratory setting which does not
account for distractions physicians may face during
a clinical encounter. Although the SUS survey was
able to measure user acceptance differences on a
cumulative level, one complex task may affect the
SUS score given by novice physicians. Although this study contains some methodological limitations, it is a well-controlled study using rigorous triangular evaluation, and the instructions were clear enough for participants to complete the required tasks.
ComparingComputerizedPhysicianOrderEntryUsabilitybetweenExpertandNovicePrimaryCarePhysicians
309
5 CONCLUSIONS
These results show that a higher level of experience with a CPOE is not equivalent to being an expert, proficient user of the CPOE. These results may also assist CPOE vendors in improving the user interface so that physicians can use the CPOE effectively, which may increase physicians' performance by reducing errors caused by poor usability of the system.
Including users in the development or redesign of CPOE may improve user performance. For example, testing the language in the menus with actual physician users in a group session may help identify the best terms to use in menu items, terms that users may find more natural. This redesign may improve
physicians’ accuracy when completing tasks in the
CPOE. This pilot provides sufficient preliminary
data for a larger, evaluative study of usability issues
of CPOE including multiple institutions and CPOE
vendors. Future studies should include a larger
sample of physicians and broaden the scope to
specialty physicians.
6 REFERENCES
1998. ISO 9241-11: Ergonomic Requirements for Office
Work with Visual Display Terminals (VDTs): Part
11: Guidance on Usability, International
Organization for Standardization.
2001. Crossing the quality chasm: A new health system
for the 21st century, Washington, DC, National
Academies Press.
2006. Committee on Identifying and Preventing
Medication Errors. Preventing Medication Errors,
Washington, DC, The National Academies Press
2006.
2010a. National Ambulatory Medical Care Survey: 2010
Summary Tables. CDC/National Center for Health
Statistics.
2010b. Selected patient and provider characteristics for
ambulatory care visits to physician offices and
hospital outpatient and emergency departments:
United States, 2009-2010. FastStats - Diabetes.
Hyattsville, MD: Centers for Disease Control and
Prevention, National Center for Health Statistics.
2011a. Healthcare Reform: Impact on Physicians. Health
Capital Topics. HealthCapital.com: Health Capital
Consultants.
2011b. MU 2011 Annual Report (Online). Available:
http://www.mydigitalpublication.com/publication/?i
=106794 (Accessed April 15 2012).
2011c. U.S. EMR Adoption Model Trends (Online).
Chicago, IL: Health Information Management
Systems Society Analytics. Available:
http://www.himssanalytics.org/docs/HA_EMRAM_
Overview_ENG.pdf (Accessed 02/21 2014).
2013a. Health, United States, 2012: With Special Feature
on Emergency Care, Hyattsville MD.
2013b. University Of Missouri Health Care Achieves
Highest Level of Electronic Medical Record Adoption
(Online). Columbia, MO. Available: http://
www.muhealth.org/body.cfm?id=103&action=detail
&ref=311.
Ash, J. S., Stavri, P. Z. & Kuperman, G. J. 2003. A
consensus statement on considerations for a
successful CPOE implementation. Journal of the
American Medical Informatics Association, 10, 229-
34.
Bangor, A., Kortum, P. & Miller, J. 2009. Determining
what individual SUS scores mean: adding an
adjective rating scale. Journal of Usability Studies, 4,
114-123.
Barnum, C. 2003. The magic number 5: Is it enough for
web testing? Information Design Journal, 11, 160-
170.
Battaglia, M. 2008. Convenience sampling. In P.
Lavrakas (Ed.), Encyclopedia of survey research
methods., Thousand Oaks, CA, Sage Publications,
Inc.
Berger, R. G. & Kichak, J. P. 2004. Computerized
physician order entry: helpful or harmful? Journal of
the American Medical Informatics Association, 11,
100-103.
Braun, V. & Clarke, V. 2006. Using thematic analysis in
psychology. Qualitative Research in Psychology, 3,
77-101.
Brooke, J. 1996. SUS-A quick and dirty usability scale.
Usability Evaluation in Industry, 189-194.
Chan, J., Shojania, K. G., Easty, A. C. & Etchells, E. E.
2011. Usability evaluation of order sets in a
computerised provider order entry system. BMJ Qual
Saf, 20, 932-40.
Cordes, R. E. 1993. The effects of running fewer subjects
on time-on-task measures. International Journal of
Human-Computer Interaction, 5, 393 - 403.
Devine, E. B., Hansen, R. N., Wilson-Norton, J. L.,
Lawless, N. M., Fisk, A. W., Blough, D. K., Martin,
D. P. & Sullivan, S. D. 2010. The impact of
computerized provider order entry on medication
errors in a multispecialty group practice. Journal of
the American Medical Informatics Association, 17,
78-84.
Ghahramani, N., Lendel, I., Haque, R. & Sawruk, K.
2009. User satisfaction with computerized order entry
system and its effect on workplace level of stress.
Journal of Medical Systems, 33, 199-205.
Higgins, J. 2005. The Correlation Coefficient. The
Radical Statistician, 1-26.
Horsky, J., Kaufman, D. R. & Patel, V. L. 2004.
Computer-based drug ordering: evaluation of
interaction with a decision-support system. Studies in
Health Technology and Informatics, 107, 1063-7.
Khajouei, R. & Jaspers, M. W. 2010. The impact of CPOE
medication systems' design aspects on usability,
workflow and medication orders: a systematic review.
Methods of Information in Medicine, 49, 3-19.
HEALTHINF2015-InternationalConferenceonHealthInformatics
310
Kim, M., Shapiro, J., Genes, N., Aguilar, M., Mohrer,
D., Baumlin, K. & Belden, J. 2012. A pilot study on
usability analysis of emergency department
information system by nurses. Applied Clinical
Informatics, 135-153.
Kjeldskov, J., Skov, M. B. & Stage, J. 2010. A
longitudinal study of usability in health care: does
time heal? Int J Med Inform, 79, e135-43.
Koppel, R., Metlay, J. P., Cohen, A., Abaluck, B.,
Localio, A. R., Kimmel, S. E. & Strom, B. L. 2005.
Role of computerized physician order entry systems
in facilitating medication errors. JAMA, 293, 1197-
203.
MedlinePlus. 2013. Basic metabolic panel (Online). National Library of Medicine, National Institutes of Health. Available: http://www.nlm.nih.gov/
medlineplus/ency/article/003462.htm (Accessed 2/25
2014).
Neinstein, A. & Cucina, R. 2011. An analysis of the
usability of inpatient insulin ordering in three
computerized provider order entry systems. J
Diabetes Sci Technol, 5, 1427-36.
Peute, L. W. & Jaspers, M. W. 2007. The significance of
a usability evaluation of an emerging laboratory order
entry system. International Journal of Medical
Informatics, 76, 157-68.
Schoen, C., Osborn, R., Huynh, P. T., Doty, M., Peugh, J.
& Zapert, K. 2006. On the front lines of care: primary
care doctors office systems, experiences, and views in
seven countries. Health Affairs, 25, w555-w571.
Centers for Medicare & Medicaid Services. 2010.
Eligible professional meaningful use core measures,
measure 14 of 15. Stage.
Walji, M. F., Kalenderian, E., Tran, D., Kookal, K. K.,
Nguyen, V., Tokede, O., White, J. M., Vaderhobli, R.,
Ramoni, R., Stark, P. C., Kimmes, N. S.,
Schoonheim-Klein, M. E. & Patel, V. L. 2013.
Detection and characterization of usability problems
in structured data entry interfaces in dentistry.
International Journal of Medical Informatics, 82,
128-138.
Yui, B. H., Jim, W. T., Chen, M., Hsu, J. M., Liu, C. Y.
& Lee, T. T. 2012. Evaluation of computerized
physician order entry system-a satisfaction survey in
Taiwan. Journal of Medical Systems, 36, 3817-24.
Zhan, C. L., Hicks, R. W., Blanchette, C. M., Keyes, M.
A. & Cousins, D. D. 2006. Potential benefits and
problems with computerized prescriber order entry:
Analysis of a voluntary medication error-reporting
database. American Journal of Health-System
Pharmacy, 63, 353-358.
ComparingComputerizedPhysicianOrderEntryUsabilitybetweenExpertandNovicePrimaryCarePhysicians
311