Burgard, W., Cremers, A. B., Fox, D., Hähnel, D.,
Lakemeyer, G., Schulz, D., Steiner, W., and Thrun, S.
(1999). Experiences with an interactive museum tour-
guide robot. Artificial Intelligence, 114(1-2), 3-55.
Courage, C. and Baxter, K. (2005). Understanding your
users: A practical guide to user requirements methods,
tools, and techniques. Gulf Professional Publishing.
DCMS (2020). Taking Part 2019/20: Cross-sectional
survey. Technical Report. Available https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/916246/Taking_Part_Technical_Report_2019_20.pdf. Retrieved 27 July 2022.
Del Vacchio, E., Laddaga, C., and Bifulco, F. (2020). Social
robots as a tool to involve students in museum
edutainment programs. In Proceedings of the 29th IEEE
International Conference on Robot and Human
Interactive Communication (RO-MAN), 476-481.
Dourish, P. (2001). Where the action is: the foundations of
embodied interaction. MIT Press, Cambridge, Mass.
French, A. and Villaespesa, E. (2019). AI, visitor
experience, and museum operations: a closer look at the
possible. In Humanizing the Digital: Un-proceedings of
the MCN 2018 Conference, 101-113.
Gaia, G., Boiano, S., and Borda, A. (2019). Engaging
museum visitors with AI: The case of chatbots.
In Museums and Digital Culture, 309-329. Springer.
Goel, A., Tung, C., Lu, Y. H., and Thiruvathukal, G. K.
(2020). A survey of methods for low-power deep
learning and computer vision. In 6th World Forum on
Internet of Things (WF-IoT), 1-6. IEEE.
Harvard Art Museums (2022). AI Explorer: Explore how a
computer sees art. Available https://ai.harvardartmuseums.org/about. Retrieved 11 August 2022.
Hincapié-Ramos, J. D., Guo, X., Moghadasian, P., and
Irani, P. (2014). Consumed endurance: a metric to
quantify arm fatigue of mid-air interactions. In
Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems, 1063-1072. ACM.
Hughes-Noehrer, L., Jay, C., and Gilmore, A. (2022).
Museums and AI Applications (MAIA) Survey.
University of Manchester. Dataset.
https://doi.org/10.48420/19298588.v1
Jang, S., Stuerzlinger, W., Ambike, S., and Ramani, K.
(2017). Modeling cumulative arm fatigue in mid-air
interaction based on perceived exertion and kinetics of
arm motion. In Proceedings of the 2017 CHI
Conference on Human Factors in Computing Systems,
3328-3339. ACM.
Lee, L., Okerlund, J., Maher, M. L., and Farina, T. (2020).
Embodied interaction design to promote creative
social engagement for older adults. In International
Conference on Human-Computer Interaction, 164-183.
Springer, Cham.
Lin, T. Y., Maire, M., Belongie, S., Hays, J., Perona, P.,
Ramanan, D., Zitnick, C. L., and Dollár, P. (2014).
Microsoft COCO: Common objects in context. In
Proceedings of the European Conference on Computer
Vision (ECCV), 740-755. Springer.
Lindgren, R., Tscholl, M., Wang, S., and Johnson, E.
(2016). Enhancing learning and engagement through
embodied interaction within a mixed reality simulation.
Computers & Education, 95, 174-187.
Mihailova, M. (2021). To dally with Dalí: Deepfake (Inter)
faces in the art museum. Convergence, 27(4), 882-898.
Miles, M. B., and Huberman, A. M. (1984). Qualitative Data
Analysis. Newbury Park, CA: Sage.
Mollica, J. (2017). Send Me SFMOMA. Available
https://www.sfmoma.org/read/send-me-sfmoma/.
Retrieved 6 August 2022.
Pitsch, K., Wrede, S., Seele, J. C., and Süssenbach, L.
(2011). Attitude of German museum visitors towards an
interactive art guide robot. In Proceedings of the 6th
International Conference on Human-Robot Interaction,
227-228. ACM.
Schrepp, M. (2017). UEQ Data Analysis Tool. Available https://www.ueq-online.org/Material/Short_UEQ_Data_Analysis_Tool.xlsx. Retrieved 26 July 2022.
Schrepp, M., Hinderks, A., and Thomaschewski, J. (2017).
Design and Evaluation of a Short Version of the User
Experience Questionnaire (UEQ-S). IJIMAI, 4 (6),
103–108.
Smilkov, D., Thorat, N., Assogba, Y., Nicholson, C.,
Kreeger, N., Yu, P., Cai, S., Nielsen, E., Soegel, D.,
Bileschi, S., and Terry, M. (2019). TensorFlow.js:
Machine learning for the web and beyond. Proceedings of
Machine Learning and Systems, 1, 309-321.
Tan, L., and Chow, K. K. (2017). Facilitating meaningful
experience with ambient media: an embodied
engagement model. In Proceedings of the 5th
International Symposium of Chinese CHI, 36-46.
Tate (2016). Can a machine make us look afresh at great art
through the lens of today’s world? IK Prize 2016:
Recognition. Available https://www.tate.org.uk/whats-on/tate-britain/exhibition/ik-prize-2016-recognition. Retrieved 11 August 2022.
The Metropolitan Museum of Art (2022). The Met Art
Explorer. Available https://art-explorer.azurewebsites.net/search. Retrieved 11 August 2022.
van Beurden, M. H., IJsselsteijn, W. A., and de Kort, Y. A.
(2012). User Experience of Gesture Based Interfaces: A
Comparison with Traditional Interaction Methods on
Pragmatic and Hedonic Qualities. LNCS, vol. 7206,
36-47. Springer, Berlin.
Voter, R. and Li, N. (2021). Next-Generation Pose
Detection with MoveNet and TensorFlow.js. Available
https://blog.tensorflow.org/2021/05/next-generation-pose-detection-with-movenet-and-tensorflowjs.html.
Retrieved 26 July 2022.
Winter, M. and Jackson, P. (2020). Flatpack ML: How to
Support Designers in Creating a New Generation of
Customizable Machine Learning Applications. LNCS,
vol. 12201, 175-193. Springer Nature.