because limited to authoritative or seminal papers.
This highlights a problematic separation between the standards-writing process and large sectors of the research community that could scrutinise the scientific basis of standards' prescriptions, if appropriate reward mechanisms were organised.
It is certainly important to understand how standards shape designers' decisions, focus their attention, and shift their priorities; sociological research on this question seems necessary. This paper is one step towards addressing challenging human factors concepts in medical standards.
ACKNOWLEDGEMENTS
We thank Dr. Sebastian Hunt for his insightful advice on this work. The AQUAS project is funded by
ECSEL JU under grant agreement No 737475. This
paper is derived from an oral presentation at the
Human Factors and Ergonomics European Meeting
held in Nantes, France, in October 2019, and we are
grateful for comments received from that audience.
REFERENCES
Alberdi, E., Strigini, L., Povyakalo, A. A., & Ayton, P.
(2009, September). Why are people’s decisions
sometimes worse with computer support?
In International Conference on Computer Safety,
Reliability, and Security (pp. 18-31). Springer, Berlin,
Heidelberg.
Alberdi, E., Povyakalo, A. A., Strigini, L., & Ayton, P. (2014). CAD: Risks and benefits for radiologists' decisions. In Samei, E., & Krupinski, E. A. (Eds.), The handbook of medical image perception and techniques (pp. 326-330).
Cambridge University Press.
Ahlstrom, V. (2008, September). The usability paradox of
the Human Factors Standard. In Proceedings of the
Human Factors and Ergonomics Society Annual
Meeting (Vol. 52, No. 24, pp. 1994-1998). Los Angeles, CA: SAGE Publications.
ANSI/AAMI. (2009). Human Factors Engineering –
Design of Medical Devices. (HE75).
FAA. (2016). U.S. Department of Transportation Federal
Aviation Administration. Human Factors Design
Standard. (HF-STD-001B).
FDA. (2018). Content of Premarket Submissions for
Management of Cybersecurity in Medical Devices.
Francis, R. (2017). Hospital devices left vulnerable, leave
patients at risk. Retrieved on June 30, 2019 from
https://www.csoonline.com/article/3167911/hospital-devices-left-vulnerable-leave-patients-vulnerable.html
Goddard, K., Roudsari, A., & Wyatt, J. C. (2014).
Automation bias: empirical results assessing
influencing factors. International Journal of Medical Informatics, 83(5), 368-375.
Grassi, P. A., Perlner, R. A., Newton, E. M., Regenscheid,
A. R., Burr, W. E., Richer, J. P., ... & Theofanos, M. F.
(2017). Digital Identity Guidelines: Authentication and
Lifecycle Management [including updates as of 12-01-2017] (NIST Special Publication 800-63B).
Hartswood, M., Procter, R., Williams, L., Prescott, R., &
Dixon, P. (1997). Drawing the line between perception
and interpretation in computer-aided mammography.
In Proceedings of the First International Conference on
Allocation of Functions (pp. 275-291).
IEC. (2007). The International Electrotechnical
Commission. General Requirements for Basic Safety
and Essential Performance - Collateral Standard:
General requirements, tests and guidance for alarm
systems in medical electrical equipment and medical
electrical systems. (IEC 60601-1-8).
IEC. (2008). The International Electrotechnical
Commission. General Requirements for Basic Safety
and Essential Performance - Collateral Standard:
Requirements for the development of physiologic
closed-loop controllers. (IEC 60601-1-10).
IEC. (2010). The International Electrotechnical
Commission. General Requirements for Basic Safety
and Essential Performance - Collateral Standard:
Usability. (IEC 60601-1-6).
IEC. (2015). The International Electrotechnical
Commission. Application of Usability Engineering to
Medical Devices. (IEC 62366-1).
IEC. (2016). The International Electrotechnical
Commission. Guidance on the Application of Usability
Engineering to Medical Devices. (IEC 62366-2).
Parasuraman, R., & Manzey, D. H. (2010). Complacency
and bias in human use of automation: An attentional
integration. Human Factors, 52(3), 381-410.
Povyakalo, A. A., Alberdi, E., Strigini, L., & Ayton, P.
(2013). How to discriminate between computer-aided
and computer-hindered decisions: a case study in
mammography. Medical Decision Making, 33(1), 98-
107.
Tsai, T. L., Fridsma, D. B., & Gatti, G. (2003). Computer
decision support as a source of interpretation error: the
case of electrocardiograms. Journal of the American
Medical Informatics Association, 10(5), 478-483.
van der Peijl, J., Klein, J., Grass, C., & Freudenthal, A.
(2012). Design for risk control: the role of usability
engineering in the management of use-related
risks. Journal of Biomedical Informatics, 45(4), 795-812.
Wiegmann, D. A. (2002). Agreeing with automated
diagnostic aids: A study of users' concurrence
strategies. Human Factors, 44(1), 44-50.
Zhang-Kennedy, L., Chiasson, S., & van Oorschot, P.
(2016, June). Revisiting password rules: facilitating
human management of passwords. In 2016 APWG Symposium on Electronic Crime Research (eCrime) (pp.
1-10). IEEE.