Editorial

Interactive Composition: New Steps in Computer Music Research


Interaction has always been a factor in performance. Every musical instrument reacts to a performance gesture at the audio level by producing a sound—colored not only by the instrument itself but also by the space in which it is played—with which the performer interacts. In recent years, we have noted a growing interest in interaction at the compositional level as well, spurred to a certain extent by a general musical interest in free improvisation. From that perspective, the idea of composing while performing, 'thinking on my feet' as improvising musician Leroy Jenkins said many years ago, raises questions of what information a score might contain and to what extent it drives the performer. In the context of computer music, the question becomes how a computer might be programmed to generate audio, as well as compositional information, that is flexible enough in its reaction to a performer to qualify as interactive. Interaction, at whatever level, refers to a mutually influential process.
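This notion of mutual influence can be made concrete with a toy feedback loop. The sketch below is purely illustrative (all names and coefficients are invented, and it models no particular system): the computer's response depends both on the performer's gesture and on the history of its own responses, while the performer's next gesture is in turn pulled toward what the computer last produced.

```python
# Toy model of mutually influential interaction (illustrative only).

def computer_response(gesture, memory):
    """Respond to a gesture, biased by the mean of past responses."""
    bias = sum(memory) / len(memory) if memory else 0.0
    return 0.5 * gesture + 0.5 * bias

def performer_gesture(intent, last_response):
    """The performer's intended gesture, pulled toward what was heard."""
    return 0.7 * intent + 0.3 * last_response

def simulate(intents):
    """Run the two-way loop: each side's output feeds the other's input."""
    memory, last, out = [], 0.0, []
    for intent in intents:
        g = performer_gesture(intent, last)
        last = computer_response(g, memory)
        memory.append(last)
        out.append(last)
    return out

responses = simulate([1.0, 0.0, 1.0])
```

Because both functions read the other side's output, neither agent's behavior is reducible to its own input alone, which is the minimal condition the editorial's definition of interaction describes.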

As interaction gained traction in computer music research centers, it became important to focus formally on the subject, identify the researchers active in the field, and understand exactly what they are doing. We learned that many composers of computer music are programming interactive processes that can generate musical material on stage in real time, to be used by performers of instruments or computer systems, and that these computer-based processes often contain rich interactive features, unique to the computer, in their ability to trigger complex processes while at the same time impacting the large-scale development of a composition. These processes may involve hard challenges in dealing both with complex macro-temporal structures and with the efficient reactivity or feedback required by musical interaction. In short, from the earliest experiments in 'interactive composing' (Chadabe, 1984) to more recent musical software and projects (e.g. Puckette, 2004; Bresson & Giavitto, 2014), the range of creativity has become vast, and current computer music research and development can be understood as exploring the conflicting domains of composition and performance.

As one response to the need for a formal understanding, a brief conference/workshop titled Interactivity in Music Composition and Performance was organized in 2015 at CIRMMT (Centre for Interdisciplinary Research in Music Media and Technology), a research center at McGill University in Montreal.[1] The event was part of IRCAM's EFFICACe project,[2] a research project focusing on the formal connections between musical computation, time, and interaction in computer-aided composition software, and on the extension of such software to the fields of real-time digital sound processing, input devices/gestural controllers, and other aspects of compositional or live interaction.

This workshop at CIRMMT featured presentations by John MacCallum (UC Berkeley); choreographer Teoma Naccarato; Thibaut Carpentier and Jérémie Garcia (IRCAM); Marlon Schumacher (McGill University); invited speakers from local academic institutions, including Sandeep Bhagwati, Adam Basanta, Marcello Giordano, and their collaborators on the :body:suit:score project (Concordia and McGill Universities), and Marcelo Wanderley (McGill University); and guest speakers Christopher Dobrian (UC Irvine) and Joel Chadabe (New York University).[3] The range of presentations offered an excellent overview of diverse approaches to the issue of interactivity, framed in the composition vs. performance duality in computer music.

One outcome of the event was the decision to address the subject from a larger and more formal perspective in a special issue of the Journal of New Music Research. Consequently, the call for contributions to this issue was opened to external contributors in order to bring in additional approaches and research. Among the highlighted topics were the notions of reactivity and interaction in compositional processes, 'off-line' processing in real-time frameworks, gestural/performance data processing, and human–computer interfaces. A committee of 23 experts reviewed the submissions, and six papers were selected for publication.

In ‘Computer-Aided Composition of Musical Processes’, Bouche et al. introduce a theoretical model for the integration of interactive processes in a computer-aided composition environment. Implemented in the OpenMusic software, this model interleaves musical structure computations in scheduling and rendering processes. Two examples of application are presented: a system of automatic co-improvisation agents, and an interface for the timed control of real-time sound generation processes.
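The interleaving of structure computation with scheduling and rendering can be sketched in miniature. The following is a hedged illustration under invented names (it does not model the OpenMusic implementation): rendering one timed event may itself compute and schedule further events, as a co-improvisation agent would when deciding its next note during playback.

```python
import heapq

def run(events):
    """Pop timed events in order; rendering an event may schedule new ones.

    Each event is a (time, label, generator) tuple; generator, if not None,
    returns a list of new events to insert into the running timeline.
    """
    heap = list(events)
    heapq.heapify(heap)
    rendered = []
    while heap:
        t, label, gen = heapq.heappop(heap)
        rendered.append((t, label))
        if gen:
            for new_event in gen(t):
                heapq.heappush(heap, new_event)  # dynamic insertion
    return rendered

def agent(t):
    """A toy 'agent': when rendered, it schedules a note one beat later."""
    if t < 3:
        return [(t + 1, f"note@{t + 1}", agent)]
    return []

timeline = run([(0, "note@0", agent)])
```

The key property, mirroring the paper's topic, is that the final timeline is not known when rendering starts: computation and playback are interleaved in a single loop.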

In ‘Authoring and Automatic Verification of Interactive Multimedia Scores’, Arias, Celerier and Desainte-Catherine focus on the i-score multimedia sequencer to address the issue of formal verification of interactive scores. They apply model-checking techniques to a timed automata representation in order to verify the correctness of interactive scores with regard to the possible paths generated by choices or non-deterministic behaviors.
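The underlying idea of checking every possible execution path can be illustrated with a toy example (a hypothetical score graph, not the i-score/timed-automata machinery): enumerate each path a performer's choices could produce and verify a temporal property on all of them.

```python
# Hypothetical interactive score: nodes with timed edges; the branch at
# 'start' represents an interactive choice made in performance.
SCORE = {
    "start": [("A", 2), ("B", 5)],
    "A": [("end", 1)],
    "B": [("end", 1)],
    "end": [],
}

def all_paths(node="start", elapsed=0):
    """Yield the total duration of every path from `node` to 'end'."""
    if node == "end":
        yield elapsed
        return
    for nxt, duration in SCORE[node]:
        yield from all_paths(nxt, elapsed + duration)

def verify(bound):
    """Exhaustively check that every possible path fits within `bound`."""
    durations = list(all_paths())
    return all(d <= bound for d in durations), durations

ok, durations = verify(bound=8)
```

Real model checkers verify far richer properties over symbolic clock regions, but the exhaustive quantification over all non-deterministic paths is the same principle.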

The bach: automated composer's helper library contributes to the fusion of composition and interactive processes by providing a wide set of compositional tools running in the Max real-time environment. In ‘Extending Bach: A Family of Libraries for Real-time Computer-Assisted Composition in Max’, Ghisi and Agostini present a wide new set of tools that extend the library with high-level compositional features and processes.

In ‘A Visual Framework for Dynamic Mixed Music Notation’, Burloiu, Cont and Poncelet address the issues of authoring and visualizing computer scores created in the Antescofo reactive music software. They propose a multi-faceted framework to represent the actions that are triggered and computed in real time during a performance.

In ‘Interactive-Compositional Authoring of Sound Spatialization’, Garcia, Carpentier and Bresson propose a software framework combining compositional and interactive objectives, focusing specifically on the issue of controlling sound spatialization, through the coupling of computer-aided composition technology (OpenMusic) and real-time DSP software (Spat).
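A typical point of coupling between the compositional and real-time sides is a spatial trajectory: positions authored as timed breakpoints on the composition side, then interpolated at render time to drive a spatializer. The sketch below is illustrative only; it does not model the Spat or OpenMusic APIs.

```python
def sample(trajectory, t):
    """Linearly interpolate an (x, y) position from timed breakpoints.

    `trajectory` is a list of (time, (x, y)) pairs in ascending time order.
    Times past the last breakpoint hold the final position.
    """
    t0, p0 = trajectory[0]
    for t1, p1 in trajectory[1:]:
        if t <= t1:
            a = (t - t0) / (t1 - t0)  # fraction along this segment
            return (p0[0] + a * (p1[0] - p0[0]),
                    p0[1] + a * (p1[1] - p0[1]))
        t0, p0 = t1, p1
    return p0

# A hypothetical trajectory: move right for 2 s, then sweep downward.
traj = [(0.0, (0.0, 1.0)), (2.0, (1.0, 1.0)), (4.0, (1.0, -1.0))]
pos = sample(traj, 1.0)
```

The compositional layer owns the breakpoint data; a real-time layer would call `sample` once per control period and forward the result to the DSP engine.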

In ‘Integrating Gesture Data in Computer-Aided Composition: A Framework for Representation, Processing and Mapping’, Schumacher and Wanderley propose a software framework (also implemented in OpenMusic) centered on the concept of mapping, with the integration of gesture signals and symbolic compositional material in musical software.
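The mapping concept can be illustrated with a minimal processing chain (hypothetical functions and ranges, not the authors' framework): a normalized gesture signal is first filtered, then mapped onto symbolic pitch material.

```python
def smooth(signal, alpha=0.5):
    """One-pole low-pass filter over a normalized gesture signal."""
    out, y = [], signal[0]
    for x in signal:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

def to_pitches(signal, low=60, high=72):
    """Map values in [0, 1] onto a MIDI pitch range (here C4 to C5)."""
    return [round(low + v * (high - low)) for v in signal]

# A toy gesture: rest, rise, hold, release.
pitches = to_pitches(smooth([0.0, 1.0, 1.0, 0.0]))
```

Separating the processing stage (`smooth`) from the mapping stage (`to_pitches`) is the point of a mapping-centered design: either stage can be swapped without touching the other, whether the target is symbolic material or synthesis parameters.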

Our immediate goal for this collection of articles is to present a diverse set of state-of-the-art music research projects based primarily on interactivity with computer systems. By highlighting a number of new concepts, ideas, and challenges in working with interactivity, among them new time paradigms, sequencing, compositional utilities, and the graphical representation and authoring of interactive processes, we hope to encourage artistic experimentation and foster new ideas in the art of music.

Jean Bresson
UMR STMS: IRCAM/CNRS/UPMC Sorbonne Universités, France
Joel Chadabe
University at Albany, State University of New York, and NYU Steinhardt Music Technology Program, New York, NY, USA

Additional information

Funding

This work was supported by Sorbonne Universités international research seminars [grant number ANR-11-IDEX-0004-02]; ANR [grant number ANR-13-JS02-004].

Notes

1 The workshop Interactivity in Music Composition and Performance was organized on May 5, 2015, with the support of the Sorbonne Universités international research seminars (‘Investissements d’avenir’ programme, ANR-11-IDEX-0004-02).

2 Extended Framework for ‘In-time’ Computer-Aided Composition (2013–2017), ANR-13-JS02-004, a project funded by the French National Research Agency (ANR).

References

  • Bresson, J., & Giavitto, J.-L. (2014). A reactive extension of the OpenMusic visual programming language. Journal of Visual Languages and Computing, 25, 363–375.
  • Chadabe, J. (1984). Interactive composing: An overview. Computer Music Journal, 8, 22–27.
  • Puckette, M. (2004). A divide between ‘compositional’ and ‘performative’ aspects of Pd. In Proceedings of the 1st International Pd Convention, Graz, Austria.
