sical performance using objective methods, as happened in previous Netytar evaluations (Davanzo et al., 2018).
REFERENCES
Badler, J. B. and Canossa, A. (2015). Anticipatory Gaze Shifts during Navigation in a Naturalistic Virtual Environment. In Proc. of the 2015 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '15), pages 277–283, London, UK. ACM.
Bailey, S., Scott, A., Wright, H., Symonds, I. M., and Ng, K. (2010). Eye.Breathe.Music: Creating music through minimal movement. In Proc. Conf. Electronic Visualisation and the Arts (EVA 2010), pages 254–258, London, UK.
Correa, A. G. D., Ficheman, I. K., do Nascimento, M., and Lopes, R. d. D. (2009). Computer Assisted Music Therapy: A Case Study of an Augmented Reality Musical System for Children with Cerebral Palsy Rehabilitation. In Proc. of the 2009 Ninth IEEE Int. Conf. on Advanced Learning Technologies, pages 218–220, Riga, Latvia. IEEE.
Davanzo, N., Dondi, P., Mosconi, M., and Porta, M. (2018). Playing music with the eyes through an isomorphic interface. In Proc. of the Workshop on Communication by Gaze Interaction (COGAIN '18), pages 1–5, Warsaw, Poland. ACM Press.
Frid, E. (2019). Accessible Digital Musical Instruments—A Review of Musical Interfaces in Inclusive Music Practice. Multimodal Technologies and Interaction, 3(3):57.
Gesierich, B., Bruzzo, A., Ottoboni, G., and Finos, L. (2008). Human gaze behaviour during action execution and observation. Acta Psychologica, 128(2):324–330.
Harrison, J. and McPherson, A. (2017). An Adapted Bass Guitar for One-Handed Playing. In Proc. of the 17th Int. Conf. on New Interfaces for Musical Expression (NIME'17), Copenhagen, Denmark.
Hornof, A. J. (2014). The Prospects For Eye-Controlled Musical Performance. In Proc. of the 14th Int. Conf. on New Interfaces for Musical Expression (NIME'14), Goldsmiths, University of London, UK.
Jacob, R. J. K. (1995). Eye tracking in advanced interface design. In Virtual Environments and Advanced Interface Design, pages 258–288. Oxford University Press, Inc., USA.
Jamboxx (n.d.). Jamboxx. https://www.jamboxx.com/. Accessed 8 June 2019.
Jones, M., Grogg, K., Anschutz, J., and Fierman, R. (2008). A Sip-and-Puff Wireless Remote Control for the Apple iPod. Assistive Technology, 20(2):107–110.
Larsen, J. V., Overholt, D., and Moeslund, T. B. (2016). The Prospects of Musical Instruments For People with Physical Disabilities. In Proc. of the 16th Int. Conf. on New Interfaces for Musical Expression (NIME'16), pages 327–331, Griffith University, Brisbane, Australia.
Lou, C. I., Migotina, D., Rodrigues, J. P., Semedo, J., Wan, F., Mak, P. U., Mak, P. I., Vai, M. I., Melicio, F., Pereira, J. G., and Rosa, A. (2012). Object Recognition Test in Peripheral Vision: A Study on the Influence of Object Color, Pattern and Shape. In Zanzotto, F. M., Tsumoto, S., Taatgen, N., and Yao, Y., editors, Proc. of the Int. Conf. on Brain Informatics, Lecture Notes in Computer Science, pages 18–26, Berlin, Heidelberg. Springer.
Marquez-Borbon, A. and Martinez Avila, J. P. (2018). The problem of DMI adoption and longevity: Envisioning a NIME performance pedagogy. In Proc. of the 18th Int. Conf. on New Interfaces for Musical Expression (NIME'18), Blacksburg, Virginia, USA. Virginia Tech Libraries.
Maupin, S., Gerhard, D., and Park, B. (2011). Isomorphic Tessellations for Musical Keyboards. In Proc. of the 2011 Sound and Music Computing Conf., pages 471–478, Conservatorio Cesare Pollini, Padova, Italy.
Morimoto, C. H., Diaz-Tula, A., Leyva, J. A. T., and Elmadjian, C. E. L. (2015). EyeJam: A Gaze-Controlled Musical Interface. In Proc. of the 14th Brazilian Symposium on Human Factors in Computing Systems (IHC '15), pages 37:1–37:9, Salvador, Brazil. ACM.
Mougharbel, I., El-Hajj, R., Ghamlouch, H., and Monacelli, E. (2013). Comparative study on different adaptation approaches concerning a sip and puff controller for a powered wheelchair. In Proc. of the 2013 Science and Information Conf., pages 597–603, London, UK.
Purves, D., Augustine, G. J., Fitzpatrick, D., Katz, L. C., LaMantia, A.-S., McNamara, J. O., and Williams, S. M. (2001). Types of Eye Movements and Their Functions. In Neuroscience, 2nd edition, pages 361–390. Sinauer Associates, USA.
Refsgaard, A. (n.d.). Eye Conductor. https://andreasrefsgaard.dk/project/eye-conductor/. Accessed 8 June 2019.
Rusconi, E., Kwan, B., Giordano, B. L., Umiltà, C., and Butterworth, B. (2006). Spatial representation of pitch height: The SMARC effect. Cognition, 99(2):113–129.
Stanford, S., Milne, A. J., and MacRitchie, J. (2018). The Effect of Isomorphic Pitch Layouts on the Transfer of Musical Learning. Applied Sciences, 8(12):2514.
Vamvakousis, Z. and Ramirez, R. (2014). P300 Harmonies: A Brain-Computer Musical Interface. In Proc. of the 2014 Int. Computer Music Conf./Sound and Music Computing Conf., pages 725–729, Athens, Greece.
Vamvakousis, Z. and Ramirez, R. (2016). The EyeHarp: A Gaze-Controlled Digital Musical Instrument. Frontiers in Psychology, 7:906.
Ward, A., Woodbury, L., and Davis, T. (2017). Design Considerations for Instruments for Users with Complex Needs in SEN Settings. In Proc. of the 17th Int. Conf. on New Interfaces for Musical Expression (NIME'17), Copenhagen, Denmark.
Zhang, X. and MacKenzie, I. S. (2007). Evaluating Eye Tracking with ISO 9241 - Part 9. In Jacko, J. A., editor, Human-Computer Interaction. HCI Intelligent Multimodal Interaction Environments, Lecture Notes in Computer Science, pages 779–788, Berlin, Heidelberg. Springer.