Article for the Special Issue on Interactive Composition

Integrating Gesture Data in Computer-aided Composition: A Framework for Representation, Processing and Mapping

Pages 87–101 | Received 17 Mar 2016; Accepted 14 Oct 2016; Published online 22 Dec 2016
