Sonification of Musicians' Ancillary Gestures - Vincent Verfaille

Sonification of Musicians’ Ancillary Gestures
Vincent Verfaille, Oswald Quek & Marcelo M. Wanderley
Input Devices and Music Interaction Laboratory
Schulich School of Music – McGill University


May 25-27, 2006 — McGill University

Problem

Ancillary gestures: not directly related to sound production. To analyze gestures, we:
1. define which kind of gestures are studied (ancillary)
2. use sensors to quantify the different kinds of movements
3. present such information using visual methods (e.g. graphs or tables)
4. perform some analysis
... but monitoring four 3D curves at once is not easy! Any other solution?


Sonification
Sonification = “use of nonspeech audio to convey information”; “transformation of data relations into perceived relations in an acoustic signal for the purposes of facilitating communication or interpretation” [Barrass & Kramer, 1999]
- helps to reveal structures in data that are not obvious in visual-only analysis [Pauletto & Hunt, 2004]
- represents data sets with large numbers of changing variables or temporally complex information (blurred or missed by visual displays)
- frees up the listener’s cognitive load & lets them focus on the data’s important aspects [Barrass & Kramer, 1999]


Enactive?

Is there a link with enactive interfaces? A traditional instrument:
- is an enactive interface (sound, haptics, visual)
- is controlled by gestures
Here, we add sound feedback for gestures that do not produce the sound!


Study configuration

3 clarinettists performing N times: Stravinsky’s Three Pieces for Solo Clarinet
- 3 expressiveness manners: normal, immobile and exaggerated [Wanderley et al., 2004]
- performances recorded with video cameras & an Optotrak 3020 system (optical tracker with active infra-red markers)
- sonifications in ‘semi real-time’: offline pre-processing (Matlab) & real-time synthesis (Max/MSP)


Building the sonification system

To build the sonification system, we focused on:
- the selection of gestures
- the choice of appropriate synthesis techniques
- building adequate mappings between gesture data and synthesis techniques
=⇒ try to build an efficient auditory scene


Choosing gestures & sound synthesis

How did we choose the gestures?

A list of ancillary gestures, provided by a thorough Laban-Bartenieff analysis of the videos [Campbell, Chagnon & Wanderley, 2005], helped to choose the 4 following ones:
- the circular movement of the clarinet bell
- the body weight transfer
- the body curvature
- the knee bend of a musician


How did we choose the synthesis techniques?

Criteria:
- unique features: each sonification is identifiable from the rest =⇒ be able to simultaneously hear 4 different gestures
- unique variation type: perceptual attributes behave differently from one synthesis to another
- unique frequency range: creates an auditory scene =⇒ can be mixed with the clarinet sound


Which synthesis technique for which gesture?

Gesture                        Synthesis technique
circular motions of the bell   Risset’s infinite glissandi (additive synthesis)
body weight transfer           beat interference technique (additive synthesis)
body curvature                 frequency modulation
knee bending                   low-pass filtering of white noise

Table: Linking gestures to sound synthesis techniques


Mappings

Sonifying circular motions of the clarinet bell
Synthesis: Risset’s infinite glissandi [Risset, 1969] (discretization of Shepard’s tones)
Mapping:
- panning: angle a(Mc(n), M(n)) between the bell marker M(n) and the centre of its circular motion Mc(n)
- gain: g(n) = hLP ∗ d(Mc(n), M(n)), the low-pass-filtered marker-to-centre distance
- chroma: unwrapped phase of the smoothed angle ã(Mc(n), M(n)), via d²ã/dt² [Drobish, 1852; Shepard, 1982]
(Figure: trajectory of the bell marker M(n−5), ..., M(n) in the (x, y) plane, with angle a(n) and distance d(n) to the centre Mc(n))
Sound example
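The infinite-glissando idea can be sketched in a few lines: octave-spaced partials climb a wrapped log-frequency axis under a fixed bell-shaped spectral envelope, so tones fade in at the bottom as they fade out at the top. This is a minimal sketch, not the talk’s Max/MSP patch; the partial count, envelope shape, and sweep rate are illustrative choices.

```python
import numpy as np

def risset_glissando(duration=4.0, sr=8000, n_partials=6,
                     f_low=100.0, cycles=1.0):
    """Endlessly rising glissando: octave-spaced partials whose
    log-frequency positions wrap around while a fixed spectral
    envelope fades them in at the bottom and out at the top."""
    n = int(duration * sr)
    t = np.arange(n) / sr
    out = np.zeros(n)
    for k in range(n_partials):
        # position of partial k on the wrapped log2-frequency axis
        pos = (k + cycles * t / duration) % n_partials
        freq = f_low * 2.0 ** pos
        amp = np.sin(np.pi * pos / n_partials) ** 2  # bell-shaped envelope
        phase = 2 * np.pi * np.cumsum(freq) / sr     # integrate instantaneous freq
        out += amp * np.sin(phase)
    return out / n_partials
```

Because the envelope goes to zero exactly where a partial wraps from the top of the range back to the bottom, the frequency discontinuity is inaudible and the glissando appears to rise forever.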


Sonifying weight transfer
Synthesis: beat interference (forward/backward)
    s(n) = cos(2πf0 n) · cos(Φtrem(n)),   with Φtrem(n) = Σᵢ₌₀ⁿ 2π ftrem(i)
and panning (left/right) using constant power (Blumlein law):
    xL(n) = (√2/2) (sin θpan + cos θpan) · s(n)
    xR(n) = (√2/2) (sin θpan − cos θpan) · s(n)
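The two equations above translate almost directly to code; the only assumption added is a sample rate for turning ftrem into a per-sample phase increment. The constant-power (Blumlein) form keeps xL² + xR² independent of θpan, so panning never changes perceived loudness.

```python
import numpy as np

def beat_pan(f0, f_trem, theta_pan, sr=8000):
    """Tremolo by beat interference plus constant-power (Blumlein) panning.
    f_trem (Hz) and theta_pan (rad) are per-sample arrays; f0 is in Hz."""
    n = len(f_trem)
    t = np.arange(n) / sr
    # running phase of the time-varying tremolo frequency
    phi_trem = 2 * np.pi * np.cumsum(f_trem) / sr
    s = np.cos(2 * np.pi * f0 * t) * np.cos(phi_trem)
    c = np.sqrt(2) / 2
    xl = c * (np.sin(theta_pan) + np.cos(theta_pan)) * s
    xr = c * (np.sin(theta_pan) - np.cos(theta_pan)) * s
    return xl, xr
```

A quick expansion shows why: xL² + xR² = s²(sin²θ + cos²θ) = s², whatever the pan angle.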


Sonifying weight transfer
Mapping:
- pan: θpan(n) = α · M(|xM(n) − xO(n)|) ∈ [−π/4, π/4] rad
- gain: g(n) = β · |yM(n) − yO(n)| ∈ [0, 1]
- tremolo: ftrem(n) = M(d(n)) ∈ [0, 8] Hz
M: non-linear mapping to emphasize curve behavior
(Figure: marker M(n) relative to the origin O(n); the forward/backward distance d(n) drives the tremolo rate, the left/right offset drives the panning)
Sound example
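The slides do not specify M beyond “non-linear mapping to emphasize curve behavior”; a hypothetical stand-in is a normalized power law that expands small displacements, here driving the tremolo rate into the [0, 8] Hz range used above. Both `emphasize` and `tremolo_rate` are illustrative names, not from the talk.

```python
import numpy as np

def emphasize(x, gamma=0.4):
    """Hypothetical stand-in for the talk's non-linear mapping M:
    a power law (gamma < 1) that expands small variations of the
    normalized marker distance so subtle weight shifts stay audible."""
    x = np.asarray(x, dtype=float)
    span = x.max() - x.min()
    u = (x - x.min()) / span if span > 0 else np.zeros_like(x)
    return u ** gamma

def tremolo_rate(d):
    """Map the forward/backward distance d(n) to a rate in [0, 8] Hz."""
    return 8.0 * emphasize(d)
```

With gamma < 1 the mapping is monotone but steepest near zero, so small forward/backward shifts already produce a clearly changing tremolo.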


Sonifying body curvature

Synthesis with frequency modulation [Chowning, 1977]:
    x(n) = a(n) · sin(αn + m(n) sin βn)
Mapping:
- modulation index (brightness): m(n) = α k(n)
- amplitude: g(n) = hLP ∗ dk(n)/dn
- 40th-order low-pass filter
- constant fundamental frequency
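The FM equation, with a constant carrier (318 Hz per the auditory-scene table) and a curvature-driven index m(n), might look like this; the modulator frequency `f_m` is an assumed value, since the slide does not give β in Hz.

```python
import numpy as np

def fm_curvature(m, f_c=318.0, f_m=80.0, sr=8000):
    """Simple FM [Chowning, 1977]: carrier at a constant f_c,
    instantaneous modulation index m(n) driven by the body curvature.
    f_m is an assumed modulator frequency (not given on the slide)."""
    n = len(m)
    t = np.arange(n) / sr
    return np.sin(2 * np.pi * f_c * t + m * np.sin(2 * np.pi * f_m * t))
```

When m(n) = 0 (a straight back) this degenerates to a pure 318 Hz sine; as curvature grows, sidebands at f_c ± k·f_m appear and the sound brightens.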


Sonifying body curvature
Curvature estimated from the second-order polynomial f(x) = p2(t) x² + p1(t) x + p0(t) as
    k(x(t)) = f′′(x) / (1 + [f′(x)]²)^(3/2) = 2 p2(t) / ([2 p2(t) x + p1(t)]² + 1)^(3/2)
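The closed-form curvature of the quadratic fit is easy to check numerically; for f(x) = x² at x = 0 it gives k = 2, and for f(x) = x²/2 it gives k = 1, the inverse radius of the osculating circle.

```python
def quadratic_curvature(p2, p1, x):
    """Curvature of f(x) = p2*x^2 + p1*x + p0 at abscissa x:
    k = f''(x) / (1 + f'(x)^2)^(3/2) = 2*p2 / ((2*p2*x + p1)^2 + 1)^(3/2).
    Note p0 drops out: a vertical offset does not bend the curve."""
    return 2.0 * p2 / ((2.0 * p2 * x + p1) ** 2 + 1.0) ** 1.5
```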

p0(t), p1(t), p2(t) from marker positions M1(t), M6(t), M9(t)
Sound example


Sonifying knee movements
Knee movements: k(n) = projection of the knee-to-hip distance
Synthesis: low-pass filtering of white noise
Mapping: k(n) −→ cut-off frequency (brightness & amplitude):
    fco(n) = hLP ∗ dk(n)/dn
low-pass filter: biquad
(Figure: knee-to-hip distance k(n) in the (y, z) plane)
Sound example
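A biquad low-pass over white noise can be sketched as follows, using one common coefficient recipe (the “Audio EQ Cookbook” form; the talk does not specify the design). The cutoff is held fixed here for clarity, whereas the sonification varies fco(n) with the knee signal.

```python
import math, random

def lowpass_biquad(x, fc, sr=8000, q=0.707):
    """Low-pass biquad (Audio EQ Cookbook coefficients), Direct Form I,
    constant cutoff fc in Hz."""
    w0 = 2 * math.pi * fc / sr
    alpha = math.sin(w0) / (2 * q)
    cw = math.cos(w0)
    b0 = (1 - cw) / 2; b1 = 1 - cw; b2 = (1 - cw) / 2
    a0 = 1 + alpha;    a1 = -2 * cw; a2 = 1 - alpha
    y = [0.0] * len(x)
    x1 = x2 = y1 = y2 = 0.0
    for i, xi in enumerate(x):
        yi = (b0 * xi + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x2, x1 = x1, xi
        y2, y1 = y1, yi
        y[i] = yi
    return y

# filter one second of white noise at a 200 Hz cutoff
random.seed(0)
noise = [random.uniform(-1, 1) for _ in range(8000)]
filtered = lowpass_biquad(noise, fc=200.0)
```

Raising fc passes more of the noise spectrum, so both brightness and overall level grow with the knee bend, as the mapping above intends.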


Auditory scene

Gesture            Synthesis            Fund. Freq.        Amp.   Panning
circular motions   infinite glissandi   ∈ [100, 2000] Hz   √      √
weight transfer    beat interference    440 Hz             √      √
curvature          FM                   318 Hz             √      —
knee               low-pass filtering   < 200 Hz           √      —

Table: Elements of the auditory scene


Sonification: an example


Example: 10’ excerpt of Stravinsky’s piece (4 gestures)


Future works

Future works: experiment to be conducted
From tool =⇒ experiments. Can sonification help to:
- ‘listen’ to those gestures ⇐⇒ watching the measurements?
- ‘hear’ more information than one can see on the videos?
- ‘hear’ more information than one can see in the sensors’ measurements?
By hearing the sonification only, can one recognize the performer, or the expressiveness manner?
Experiment: 8 video excerpts, each one with 5 different sonifications (time-warped):
- 3 expressiveness manners for the same performer
- the same expressive manner for the 2 other performers


Future Works: improvements

Possible improvements:
- real-time sonification (mapping partly made in Matlab)
- interactive sonification: real-time modification of sonification parameters (sensor choice, mapping, settings)
- user controls the synthesis techniques / mixing (auditory scene)


Conclusions

From preliminary observations:
- up to 4 of a musician’s ancillary gestures can be sonified and heard more or less clearly
- sonification = a complementary tool to identify and qualitatively analyse musicians’ ancillary movements
- next steps: formal experiment & interactive sonifications


Bibliography

S. Barrass and G. Kramer, “Using sonification,” Multimedia Systems, vol. 7, pp. 23–31, June 1999.
L. Campbell, M.-J. Chagnon, and M. M. Wanderley, “On the use of Laban-Bartenieff techniques to describe ancillary gestures of clarinetists,” Tech. Rep., IDMIL, McGill, 2005.
J. Chowning, “The synthesis of complex audio spectra by means of frequency modulation,” Comp. Music Journal, vol. 1, no. 2, pp. 46–54, 1977.
T. Hermann, O. Höner, and H. Ritter, “AcouMotion – an interactive sonification system for acoustic motion control,” in Proc. Int. Gesture Workshop (GW 2005), Vannes, 2005.
J. C. Risset, “Pitch control and pitch paradoxes demonstrated with computer-synthesized sounds,” Jour. Ac. Soc. of Am., vol. 46, no. (A), p. 88, 1969.
M. M. Wanderley, B. W. Vines, N. Middleton, C. McKay, and W. Hatch, “Expressive movements of clarinetists: Quantification and musical considerations,” Tech. Rep. MT2004-IDIM01, IDMIL, McGill, Oct. 2004.
M. M. Wanderley, Ph. Depalle, and O. Warusfel, “Improving instrumental sound synthesis by modeling the effect of performer gestures,” in Proc. Int. Comp. Music Conf., 1999, pp. 418–21.


Sonifying circular motions of the clarinet bell

Figure: Example of the smoothing of amplitude


Sonifying circular motions of the clarinet bell

Problem? Short sounds appear without circular motions.
In fact, we hear the beginnings of circular motions!


Sonifying weight transfer
M: non-linear mapping to emphasize curve behavior


Sonifying weight transfer

Difficulties: choice of M(n) (mean between two markers) and of O(n) (initial position or global mean position?)


Sonifying body curvature
Synthesis with frequency modulation [Chowning, 1977]
Mapping:
- modulation index (brightness): m(n) = α k(n)
- amplitude: g(n) = hLP ∗ dk(n)/dn
- 40th-order low-pass filter
- constant fundamental frequency
Curvature estimated with 2 models:
- radius of the circle passing through 3 points (fast, not accurate)
- curvature of a second-order polynomial (slower, more accurate)
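The fast model, the inverse radius of the circle through three marker positions, reduces to a one-line formula, k = 4·Area / (product of side lengths):

```python
import math

def three_point_curvature(p, q, r):
    """Fast curvature estimate: inverse radius of the circle through
    three 2D marker positions, k = 1/R = 4*Area / (|pq|*|qr|*|rp|)."""
    ax, ay = q[0] - p[0], q[1] - p[1]
    bx, by = r[0] - p[0], r[1] - p[1]
    area2 = abs(ax * by - ay * bx)            # twice the triangle area
    d1 = math.dist(p, q)
    d2 = math.dist(q, r)
    d3 = math.dist(r, p)
    if d1 * d2 * d3 == 0:
        return 0.0                            # degenerate / coincident markers
    return 2.0 * area2 / (d1 * d2 * d3)
```

This is exact for points lying on a circle, but marker jitter shifts all three points independently and is strongly amplified, which is consistent with the slide’s “fast, not accurate” verdict versus the polynomial fit.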


Sonifying body curvature
Second-order polynomial: f(x) = p2(t) x² + p1(t) x + p0(t)
Curvature: k(x(t)) = f′′(x) / (1 + [f′(x)]²)^(3/2) = 2 p2(t) / ([2 p2(t) x + p1(t)]² + 1)^(3/2)

p0 (t), p1 (t), p2 (t) computed from marker positions M1 (t), M6 (t), M9 (t)


Sonifying body curvature
Examples with 3 performers; curvature measured at the shoulder:


Questions

Can sonification help to:
- ‘listen’ to those gestures ⇐⇒ watching the measurements?
- ‘hear’ more information than one can see on the videos?
- ‘hear’ more information than one can see in the sensors’ measurements?
