responses to the free text question that show the most
popular feedback.
4 SUMMARY OF SURVEY RESULTS
The survey results can be summarized as follows:
Overall, the RSNA attendees who received a demonstration of the AI tool were impressed with the system and receptive to adoption. Nearly 85% were impressed by the technology, and the majority reported that they would like to use such a tool in the future. The remaining 15% of survey takers exhibited some skepticism, indicating that they did not believe a cognitive assistant would help improve their workflow. As seen in Figure 4, the majority of participants also indicated that they would use such a tool if it were validated by their own or a peer's personal experience, or if it were shown to be valid in a peer-reviewed publication. Respondents reported being less likely to be convinced by endorsement from a government entity or by local validation.
Despite the widely publicized concern that AI systems may threaten physician jobs, the RSNA attendees' responses suggest that they are most motivated by clinical outcomes. As AI continues to improve, it will likely become a regularly used tool in all aspects of healthcare. An intelligent cognitive assistant could be a major factor in reducing clinical workload, allowing physicians to focus on their primary purpose: the patient. Our results indicate that most people who participated in this experience are open and ready for a future of AI-augmented medicine.
While this study showed promise for AI-aided cognitive assistants, it has a few limitations. First, we did not rigorously collect accuracy data or validate the credentials of the test-takers; our information was derived from scanned RSNA badges and system interaction. As a result, we cannot report whether users' impressions were influenced by their individual performance relative to the system. Their receptivity could conceivably be influenced by a technology that is either so outstanding that it is threatening, or so poor that it is not helpful. Second, we could not control for bias. For example, we do not know whether the attendees were self-selected because of their interest in this technology, and we did not collect data that might have indicated whether the exhibit changed a user's impression of the technology.
Nevertheless, the results indicate that imaging professionals are open to the use of artificial intelligence technologies to provide cognitive assistance, particularly if validated by personal experience, a peer reference, or published research.
APPENDIX A
Question 1. During the demonstration, how helpful
was the system’s assistance in your decision-making?
a. Not at all helpful
b. Slightly helpful
c. Neutral
d. Somewhat helpful
e. Very helpful
Question 2. Overall, how impressed were you with
the system demo?
a. Not at all impressed
b. Slightly impressed
Receptivity of an AI Cognitive Assistant by the Radiology Community: A Report on Data Collected at RSNA