Sonification of Performers’ Ancillary Gestures Vincent Verfaille, Oswald Quek & Marcelo M. Wanderley Input Devices and Music Interaction Lab. (IDMIL) Sound Processing and Control Lab. (SPCL)
CIRMMT
Centre for Interdisciplinary Research in Music Media and Technology
June 23, 2006
V. Verfaille, O. Quek & M. M. Wanderley
Sonification of Performers’ Ancillary Gestures
Context: analysis of gestural control
To create new digital musical instruments (DMIs): sound synthesizer, sensors, mapping =⇒ and a better understanding of performers’ gestures (analysis)
Video example: 1-minute excerpt of the Stravinsky piece, with sonifications
Performer: Louise Campbell
Focus on ancillary gestures: =⇒ gestures not directly related to sound production.
When is sonification needed?
To analyse gestures, we:
1) define which kind of gestures are studied (ancillary)
2) use sensors to quantify the different kinds of movements
3) present such information using visual methods (e.g. graphs or tables)
4) perform some analysis (somewhat related movements)
... but monitoring several 3D curves at once is not easy! Any other solution?
=⇒ using sonification! (coupling sonification, gestural control & sound synthesis)
Sonification Rationale
Sonification = “transformation of data relations into perceived relations in an acoustic signal for the purposes of facilitating communication or interpretation” [Barrass & Kramer, 1999]
helps to reveal structures in data that are not obvious in visual-only analysis [Pauletto & Hunt, 2004]
represents data sets with large numbers of changing variables / temporally complex information (blurred or missed by visual displays)
frees the listener’s cognitive load & enables focusing on the data’s important aspects [Barrass & Kramer, 1999]
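The definition above can be made concrete with a minimal parameter-mapping sketch (Python/NumPy; the function name, sample rate and note duration are illustrative choices, and the 100–2000 Hz pitch range echoes the glissando range of the mapping slide):

```python
import numpy as np

def sonify(data, fmin=100.0, fmax=2000.0, sr=8000, note_dur=0.25):
    """Parameter mapping: each data value becomes the pitch (in Hz)
    of a short sine tone, so data relations become pitch relations."""
    data = np.asarray(data, dtype=float)
    span = data.max() - data.min()
    norm = (data - data.min()) / span if span > 0 else np.zeros_like(data)
    freqs = fmin + norm * (fmax - fmin)          # linear data -> pitch map
    t = np.arange(int(sr * note_dur)) / sr       # time axis of one note
    tones = [np.sin(2 * np.pi * f * t) for f in freqs]
    return freqs, np.concatenate(tones)

# three data points -> three tones at 100, 1050 and 2000 Hz
freqs, sig = sonify([0.0, 0.5, 1.0])
```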
Quick Comparison: Sonification – DMI Design
Some common aspects:
make good sounds from control curves/gestures
mapping strategy issues: 1) N-to-M mapping: N data signals −→ M control signals; 2) signal conditioning
choice of “good synthesizers”
recent “discipline”: a beginning of self-organization concerning design guidelines, re-usability, methodology, evaluation, etc. [Wanderley, 2002 (OS); Frauenberger, 2006 (ICAD)]
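The shared N-to-M mapping and signal-conditioning issues can be sketched as follows (a hedged Python/NumPy illustration; the one-pole smoothing coefficient and the weight matrix are arbitrary values, not the authors’ settings):

```python
import numpy as np

def condition(x, alpha=0.05):
    """Signal conditioning: one-pole low-pass smoothing followed by
    normalization to [0, 1] (a common pre-mapping step)."""
    y = np.empty_like(x)
    acc = x[0]
    for i, v in enumerate(x):
        acc += alpha * (v - acc)   # one-pole low-pass
        y[i] = acc
    span = y.max() - y.min()
    return (y - y.min()) / span if span > 0 else np.zeros_like(y)

def map_n_to_m(signals, W):
    """N-to-M mapping: each of M control signals is a weighted
    combination of N conditioned data signals."""
    X = np.vstack([condition(s) for s in signals])   # shape (N, T)
    return W @ X                                     # shape (M, T)

# 2 data signals -> 3 control signals via a 3x2 weight matrix
t = np.linspace(0.0, 1.0, 100)
W = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
controls = map_n_to_m([np.sin(2 * np.pi * t), t], W)
```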
Setup
3 clarinettists performing 6 times: Stravinsky’s Three Pieces for Solo Clarinet
3 expressiveness manners: immobile, standard and exaggerated [Wanderley et al., 2004]
recorded with video cameras & Optotrak 3020 system (optical tracker with active infra-red markers)
sonifications in ‘semi real-time’: offline pre-processing (Matlab) & real-time synthesis (Max/MSP)
=⇒ a choice of gestures & sound synthesis that simplifies the mapping
Choosing Gestures
ancillary gestures selected from a thorough analysis of videos w/ Laban-Bartenieff [Campbell, Chagnon & Wanderley, 2005]:
1) the circular movement of the clarinet bell
2) the body weight transfer
3) the body curvature
4) the knee bend
low-frequency & continuous motions (infra-sound signals) =⇒ need for continuous-like (non-percussive) and non-annoying sounds [Pauletto & Hunt 2006 (ICAD)]
Choosing Sound Synthesis
sound synthesis creates a ‘good’ auditory scene:
potentially mixed with the clarinet sound =⇒ need for ‘non-intrusive’ sounds
identifiability of each sonification from the other ones =⇒ simultaneously hear 4 gestures (perceptual attributes differ in behavior)
resulting in ‘lucky’ intuitive choices:
3 distinct pitched sounds (1 varies) ⇐⇒ 1 noisy sound
amplitude morphing (sounds appear/disappear)
timbre differences (pure tone, harmonic pattern, noise); 2 constant brightness ⇐⇒ 2 varying brightness
panning of sounds related to L/R or F/B motions
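The four timbre families above (pure tone, AM, FM, noise) plus panning can be sketched in Python/NumPy; this is an illustrative sketch, not the authors’ Max/MSP patch. The 440 Hz and 318 Hz carriers follow the mapping slide; modulation rates, depths and the FM index are assumptions:

```python
import numpy as np

SR = 16000
t = np.arange(SR) / SR                       # one second of audio

def pure_tone(f):
    """Pitched pure tone (constant brightness)."""
    return np.sin(2 * np.pi * f * t)

def am_tone(fc=440.0, fm=4.0, depth=0.8):
    """Amplitude morphing: sound appears/disappears with the envelope."""
    env = (1 - depth) + depth * 0.5 * (1 + np.sin(2 * np.pi * fm * t))
    return env * np.sin(2 * np.pi * fc * t)

def fm_tone(fc=318.0, fm=5.0, index=3.0):
    """FM tone: harmonic pattern, hence varying brightness."""
    return np.sin(2 * np.pi * fc * t + index * np.sin(2 * np.pi * fm * t))

def filtered_noise(alpha=0.05, seed=0):
    """Noisy sound, low-passed (one-pole filter)."""
    x = np.random.default_rng(seed).standard_normal(t.size)
    y = np.empty_like(x)
    acc = 0.0
    for i, v in enumerate(x):
        acc += alpha * (v - acc)
        y[i] = acc
    return y

def pan(sig, theta):
    """Constant-power L/R panning, theta in [0, pi/2]."""
    return np.vstack([np.cos(theta) * sig, np.sin(theta) * sig])
```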
Mapping Gestures to Sound Synthesis
Mapping design: simple, 1-to-1, explicit =⇒ to create an ‘efficient’ auditory scene.
Signal conditioning used to extract the information (low-pass filtering, absolute derivative, etc.)

Gesture (extracted signal) −→ Synthesis
circular motions (angle, distance) −→ ∞ glissandi, F(t) ∈ [100, 2000] Hz (chroma)
body weight transfer (left/right, forward/backward distance) −→ AM, 440 Hz
body curvature (curvature) −→ FM, 318 Hz
knee bending (projected distance) −→ low-pass filter, cutoff < 200 Hz

Each mapping drives a subset of the synthesis parameters F(t), a(t), θ(t) and b(t) (frequency, amplitude, panning, brightness).
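The signal conditioning named above (low-pass filtering, absolute derivative) can be sketched as follows; the marker names and the 5-frame smoothing window are hypothetical, not the authors’ Matlab settings:

```python
import numpy as np

def abs_derivative(x, sr=100.0):
    """Absolute derivative of a control curve: rectified speed, e.g.
    to derive motion intensity from a marker trajectory."""
    return np.abs(np.diff(x)) * sr           # |dx/dt|, frame rate sr

def knee_bend_control(knee_xyz, hip_xyz):
    """Distance between two (hypothetical) markers per frame,
    smoothed by a 5-frame moving-average low-pass filter."""
    d = np.linalg.norm(knee_xyz - hip_xyz, axis=1)   # Euclidean distance
    kernel = np.ones(5) / 5.0                        # moving average
    return np.convolve(d, kernel, mode='valid')
```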
Video Example
Sonifications: Max/MSP real-time demo – Stravinsky’s piece with 4 gestures
Observations / Next Steps
Observations after a pilot experiment:
sonification = a complementary tool to identify / qualitatively analyse ancillary gestures
up to 4 sonified gestures can be heard more or less clearly (depending on the mapping)
Next steps:
1) real-time sonification: signal conditioning in Max/MSP instead of Matlab
2) interactive sonification: user control of sonification parameters, e.g. markers / mapping / synthesis techniques / mixing choice
Next Steps / Applications
Next steps:
3) a formal experiment to test if sonification can help to:
‘hear’ more information ⇐⇒ see on videos or markers?
differentiate performers, expressive manners?
Conditions: 3 performers × 3 expressiveness manners (immobile, standard, exaggerated)
=⇒ 5 sonifications, each paired with identical video; AB pair comparison of the 5 sonifications, for 8 × 3 excerpts
Other applications: gesture learning / training (using audio feedback) DMI design: enhanced gestural control composition & performance: sonification with a clarinet synthesizer Ssynth or by control of DAFx
Sonification of Performers’ Ancillary Gestures Vincent Verfaille, Oswald Quek & Marcelo M. Wanderley IDMIL & SPCL
CIRMMT
Centre for Interdisciplinary Research in Music Media and Technology
[email protected]
About Ancillary Gestures
M. M. Wanderley, Ph. Depalle, and O. Warusfel, “Improving instrumental sound synthesis by modeling the effect of performer gestures,” in Proc. Int. Computer Music Conf., 1999, pp. 418–21.
M. M. Wanderley, B. W. Vines, N. Middleton, C. McKay, and W. Hatch, “Expressive movements of clarinetists: Quantification and musical considerations,” Tech. Rep. MT2004-IDIM01, IDMIL, McGill, Oct. 2004.
L. Campbell, M.-J. Chagnon, and M. M. Wanderley, “On the use of Laban-Bartenieff techniques to describe ancillary gestures of clarinetists,” Tech. Rep., IDMIL, McGill, 2005.
About Sonification
R. Shepard, “Circularity in judgments of relative pitch,” Journal of the Acoustical Society of America, vol. 36, no. 12, pp. 2346–53, 1964.
J.-C. Risset, “Pitch control and pitch paradoxes demonstrated with computer-synthesized sounds,” Journal of the Acoustical Society of America, vol. 46, no. 1(A), p. 88, 1969.
J. Chowning, “The synthesis of complex audio spectra by means of frequency modulation,” Journal of the Audio Engineering Society, vol. 21, no. 7, pp. 526–34, 1973.
S. Barrass and G. Kramer, “Using sonification,” Multimedia Systems, vol. 7, pp. 23–31, 1999.
T. Hermann, O. Höner, and H. Ritter, “AcouMotion – an interactive sonification system for acoustic motion control,” in Proc. Gesture Workshop, 2005.
V. Verfaille, O. Quek, and M. M. Wanderley, “Sonification of musicians’ ancillary gestures,” in Proc. ENACTIVE Workshop, Montreal, May 2006.