Figure 1: A short example of plain-text notation for J. Haury's metapiano. The score contains voice, pitch and velocity encoding, together with basic information on articulations.
hard task, even if theoretically feasible (Baldan et al.,
2009).
Needless to say, an IEEE 1599 document can con-
tain much more, as explained in Section 2. For in-
stance, it could host a number of pre-recorded audio
tracks referring to other performances of the piece,
or conceived as a background for the current performance. [1]
Similarly, the Notational layer could host
evocative graphics together with a traditional score
version in common Western notation.
A number of IEEE 1599 applications oriented to music education have been discussed in (Baratè et al., 2009) and (Baratè and Ludovico, 2012). In this context, the novelty is the presence of meta-instrument notation. Usually it contains basic symbolic information (e.g. notes, rests, a few articulation signs), namely the input required by the meta-instrument parser. A simple example is the notation for the metapiano by Jean Haury, illustrated in Figure 1. It is worth underlining that the information contained in a meta-instrument score is potentially redundant with the contents of the Logic layer; in fact, since the encoding rules of both representations are known, an automatic conversion between the two formats is possible.
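Since both encodings are symbolic, such a conversion can be sketched in a few lines. The Python fragment below is only an illustration: the plain-text line format it assumes (an event number followed by groups made of an articulation token, a voice number and a MIDI pitch, loosely modelled on the lines shown in Figure 1) and the LogicEvent structure are assumptions of this sketch, not the actual metapiano grammar nor the IEEE 1599 schema.

```python
# Illustrative converter from a hypothetical plain-text meta-instrument score
# to Logic-layer-like note events. The assumed line layout is:
#   <event number> ( <articulation token> <voice> <MIDI pitch> )*
from dataclasses import dataclass

@dataclass
class LogicEvent:
    event_id: str      # identifier reusable as a spine reference
    voice: int
    midi_pitch: int
    articulation: str  # e.g. "[]", "[<", "[>", copied verbatim from the score

def parse_score_line(line: str) -> list[LogicEvent]:
    """Parse one score line into Logic-layer-like events (assumed format)."""
    event_no, *rest = line.split()
    events = []
    for i in range(0, len(rest) - 2, 3):   # groups of three tokens
        art, voice, pitch = rest[i], int(rest[i + 1]), int(rest[i + 2])
        events.append(LogicEvent(f"e{event_no}_{i // 3}", voice, pitch, art))
    return events

if __name__ == "__main__":
    for ev in parse_score_line("1 [] 1 63 [< 4 70 [< 3 75 [< 2 79"):
        print(ev)
```

The opposite direction, from the Logic layer to a meta-instrument score, would simply apply the same mapping in reverse.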
Moreover, software tools and plug-ins have been developed to compile the Logic layer starting from commonly adopted formats (e.g. MusicXML and MIDI) as well as from score editing software (e.g. MuseScore, MakeMusic Finale® and Sibelius®). Similarly, computer applications could be implemented for ad hoc meta-instrument scores, too.
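As a rough idea of how such a compilation step may look, the sketch below extracts note-on events from a Standard MIDI File with the mido library and arranges them into a Logic-layer-like XML fragment. The element and attribute names (logic, spine, event, timing) are simplified placeholders chosen for readability rather than the normative IEEE 1599 schema, and durations, voices and rests are deliberately ignored.

```python
# Minimal sketch: compile a Logic-layer-like fragment from a MIDI file.
# Requires the mido library; the XML vocabulary is a placeholder.
import xml.etree.ElementTree as ET
import mido

def midi_to_logic(path: str) -> ET.Element:
    logic = ET.Element("logic")
    spine = ET.SubElement(logic, "spine")
    abs_time = 0.0
    event_no = 0
    for msg in mido.MidiFile(path):   # iteration yields delta times in seconds
        abs_time += msg.time
        if msg.type == "note_on" and msg.velocity > 0:
            ET.SubElement(spine, "event",
                          id=f"e{event_no}",
                          timing=f"{abs_time:.3f}",
                          pitch=str(msg.note),
                          velocity=str(msg.velocity))
            event_no += 1
    return logic

if __name__ == "__main__":
    print(ET.tostring(midi_to_logic("score.mid"), encoding="unicode"))
```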
IEEE 1599 provides a rich music description, including multiple audio, video and score digital objects. Since the format supports any representation of score symbols, new notations for music meta-instruments can also be embedded and synchronized with all the other contents.
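Conceptually, this synchronization relies on a shared list of abstract music events onto which every digital object maps its own coordinates. The toy data structure below, with invented identifiers and media names, is only meant to picture the idea.

```python
# Toy picture of spine-based synchronization: one abstract event identifier,
# many medium-specific locations. All values below are invented.
sync_map = {
    "e0": {"audio_take1_s": 0.00, "score_page_xy": (1, 120, 340), "metapiano_line": 1},
    "e1": {"audio_take1_s": 0.48, "score_page_xy": (1, 190, 340), "metapiano_line": 2},
    "e2": {"audio_take1_s": 0.95, "score_page_xy": (1, 260, 340), "metapiano_line": 3},
}

def locate(event_id: str, medium: str):
    """Return where a given abstract event occurs inside a given medium."""
    return sync_map[event_id][medium]

print(locate("e1", "audio_take1_s"))   # 0.48 seconds into the first audio take
```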
[1] Please note that in this case timing information would be implicitly provided to the human player. Such a result could be either desirable, e.g. to teach students how to keep time with the music, or unwanted, e.g. when the goal is to let children express themselves freely during Music Therapy sessions.
After producing the IEEE 1599 document, the second phase, i.e. music performance, can start. Before the design of this framework, two totally independent environments were available:
• An IEEE 1599 viewer, namely an environment oriented to a multi-layer and synchronized musical experience. This software can simultaneously present information coming from multiple layers, allowing the user to enjoy them together and to choose the material to bring to the foreground. The user is active in the choice of the current materials (scores, audio tracks, video clips, etc.) and can use standard navigation controls (start, stop, pause, change current position); however, from the performance point of view, the user can only experience already prepared materials.
• A meta-instrument parser, where a symbolic score is loaded and the user can interact through the interface of the musical instrument. The parser is not standard, since it is customized for the specific meta-instrument. Besides, it usually gets input only from the external controller and from a digital score representation. Consequently, any other interaction with related materials is delegated to a-posteriori processing of its output, which limits the expressive possibilities of the framework.
These two environments could be (and actually have been) implemented under different HW/SW architectures, and implementation details are not relevant for our proposal. For instance, IEEE 1599 players have been developed both as multi-platform off-line applications and as components embedded in Web portals. Similarly, some meta-instruments are entirely implemented via software, while others are based on the communication between Arduino and the Max/MSP environment. An example of the latter category will be provided in Section 4. Our idea is to create a unified framework where the two contributions can be mixed and integrated, in order to take advantage of both approaches.
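A minimal sketch of the glue imagined between the two environments is reported below: the meta-instrument parser emits abstract playback commands, and any IEEE 1599 player able to consume them can be driven in real time, with the tempo dictated by the human performer. Class and method names (PlayEvent, SynchronizedPlayer, on_trigger) are assumptions made for illustration, not an existing API of either environment.

```python
# Sketch of the parser-to-player contract assumed in the proposed framework.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class PlayEvent:
    spine_id: str   # which abstract score event to render
    velocity: int   # dynamics taken from the controller gesture

class SynchronizedPlayer(Protocol):
    """Anything able to render all IEEE 1599 contents linked to a spine event."""
    def render(self, cmd: PlayEvent) -> None: ...

class MetaInstrumentParser:
    def __init__(self, spine_ids: list[str], player: SynchronizedPlayer):
        self._pending = iter(spine_ids)
        self._player = player

    def on_trigger(self, velocity: int) -> None:
        """Called on every controller gesture: the human supplies the tempo,
        the symbolic score supplies the next event to be rendered."""
        spine_id = next(self._pending, None)
        if spine_id is not None:
            self._player.render(PlayEvent(spine_id, velocity))
```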
As regards the music meta-instrument, it can be
any hardware or software device capable of sending
computer-interpretable messages: MIDI controllers,
external peripherals such as computer keyboards,
graphical interfaces, and so on.
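Under the hypothetical interface sketched above, each controller only needs a thin adapter that reduces its native messages to generic triggers. For instance, a MIDI adapter (here based on the mido library) and a bare computer-keyboard adapter might look as follows; both functions are illustrative assumptions rather than part of any existing meta-instrument.

```python
# Sketch of controller adapters: heterogeneous devices are reduced to the
# same trigger message before reaching the parser.
import mido

def midi_adapter(parser, port_name: str) -> None:
    """Forward every note-on of a MIDI controller as a generic trigger."""
    with mido.open_input(port_name) as port:
        for msg in port:   # blocking iteration over incoming messages
            if msg.type == "note_on" and msg.velocity > 0:
                parser.on_trigger(msg.velocity)

def keyboard_adapter(parser) -> None:
    """Computer keyboard as a meta-instrument: every keypress is a trigger
    with a fixed dynamic level, since no velocity information is available."""
    while True:
        input("press Enter to play the next event ")
        parser.on_trigger(64)
```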
The function of the parser is to interpret both the IEEE 1599 document and the controller input, producing a sequence of commands that drive the player. One of its key roles is disambiguating synchronization. As mentioned before, most contents in an IEEE 1599 document carry intrinsic timing information, such as all audio and video tracks. On the contrary, in this context the metronome is provided by the human player, so