study are encouraging, we need to develop a larger, independently labelled test
corpus to provide a clearer quantitative analysis of system performance. We will use this
corpus to develop and evaluate extended schemes for measuring arousal and
valence, such as those introduced in [4], which incorporate visual motion activity and
the density of shot cuts from the video stream as components of the arousal measure,
and to explore methods for identifying dominant emotions.
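As an illustration of the kind of extended scheme referred to above, the following is a minimal sketch of combining per-shot component signals (motion activity, shot-cut density, and an audio component) into a single arousal curve. The component names, the equal weights, the min-max normalisation, and the simple moving-average smoother are illustrative assumptions; the scheme in [4] uses a different formulation (including Kaiser-window smoothing of each component).

```python
def smooth(signal, window=5):
    """Centred moving average; a simple stand-in for a proper smoothing kernel."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out


def arousal_curve(motion_activity, cut_density, audio_energy,
                  weights=(1 / 3, 1 / 3, 1 / 3)):
    """Weighted combination of smoothed, min-max-normalised component signals.

    All three inputs are assumed to be equal-length sequences sampled on the
    same (e.g. per-shot) time base; the result lies in [0, 1].
    """
    def normalise(s):
        lo, hi = min(s), max(s)
        return [(x - lo) / (hi - lo) if hi > lo else 0.0 for x in s]

    comps = [normalise(smooth(c))
             for c in (motion_activity, cut_density, audio_energy)]
    return [sum(w * c[i] for w, c in zip(weights, comps))
            for i in range(len(motion_activity))]
```

A curve produced this way could be compared against manually assigned arousal labels, or its peaks used as candidate locations for dominant emotions.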
In our ongoing work we are extending our study to compare our automatic audio
annotation with the affect labels generated in [7] using the manual audio descriptions.
Results of this comparison may suggest ways of combining the alternative annotation
schemes to provide richer or more reliable affective labelling. We also plan
to explore the use of affect annotation in the automatic comparison of multimedia docu-
ments for both retrieval and classification applications. Such information might be used
to recommend items a user is likely to enjoy, for example movies with a structure
similar to those they have viewed previously.
References
1. Smeaton, A.F., Lee, H. and McDonald, K.: Experiences of Creating Four Video Library
Collections with the Fischlar System, International Journal on Digital Libraries, 4(10) (2004)
42-44
2. Hauptmann, A.G., Christel, M.G.: Successful Approaches in the TREC Video Retrieval
Evaluations, Proceedings of ACM Multimedia 2004, New York City, ACM (2004) 668-675
3. Zhang, T., Kuo, C. C. J.: Content-Based Audio Classification and Retrieval for Audiovisual
Data Parsing, Kluwer Academic Publishers, (2001)
4. Hanjalic, A., Xu, L.-Q.: User-Oriented Affective Video Content Analysis, Proceedings of
the IEEE Workshop on Content-based Access of Image and Video Libraries (CBAIVL’01),
IEEE (2001) 50-57
5. Picard, R.W., Cosier, G.: Affective Intelligence - the Missing Link?, BT Technology
Journal, 14(4) (1997)
6. Inanoglu, Z., Caneel, R.: Emotive Alert: HMM-Based Emotion Detection In Voicemail Mes-
sages, Proceedings of the 10th International Conference on Intelligent User Interfaces (IUI
’05), San Diego, ACM (2005) 251-253
7. Salway, A. and Graham, M.: Extracting Information about Emotions in Films, Proceedings
of ACM Multimedia, Berkeley, ACM (2003) 299-302
8. Russell, J., Mehrabian, A.: Evidence for a Three-Factor Theory of Emotions, Journal of
Research in Personality, 11 (1977) 273-294
9. Bradley, M. M.: Emotional Memory: A Dimensional Analysis. In: van Groot, S., van de Poll,
N.E., Sargeant, J. (eds.) The Emotions: Essays on Emotion Theory, Hillsdale, NJ: Erlbaum
(1994) 97-134
10. Dietz, R., Lang, A.: Affective Agents: Effects of Agent Affect on Arousal, Attention, Liking
and Learning, Proceedings of the Third International Cognitive Technology Conference, San
Francisco (1999)
11. Ortony, A., Clore, G.L., Collins, A.: The Cognitive Structure of Emotions, Cambridge
University Press (1988)
12. Picard, R.: Affective Computing, MIT Press (1997)