
Computational systems for music improvisation

Toby Gifford, Shelly Knotts, Stefano Kalonaris, Matthew Yee-King, Mark d'Inverno & Jon McCormack
Pages 19-36 | Published online: 22 Feb 2018
ABSTRACT

Computational music systems that afford improvised creative interaction in real time are often designed for a specific improviser and performance style. As such, the field is diverse, fragmented and lacks a coherent framework. Through analysis of examples in the field, we identify key areas of concern in the design of new systems, which we use as categories in the construction of a taxonomy. From our broad overview of the field, we select significant examples to analyse in greater depth. This analysis serves to derive principles that may help designers scaffold their work on existing innovations. We explore successful evaluation techniques from other fields and describe how they may be applied to iterative design processes for improvisational systems. We hope that by developing a more coherent design and evaluation process, we can support the next generation of improvisational music systems.

Notes on contributors

Toby Gifford is a sound technologist, with a diverse array of research interests under the banner of 'sonic environments'. His practice utilizes generative algorithms alongside field recording, with applications in bioacoustics, aural architecture and computational creativity. He is currently a Research Fellow at SensiLab, Monash University, working on the ARC-funded project Improvisational Interfaces.

Shelly Knotts produces live-coded and network music performances and projects which explore aspects of code, data and collaboration in improvisation. She performs and presents her work internationally, and collaborates prolifically with computers and other humans. She is currently a Research Fellow at SensiLab, Monash University working on the ARC-funded project Improvisational Interfaces.

Stefano Kalonaris is a creative technologist, musician and researcher who specializes in Interactive Music Systems, Music Computing and Performance. He holds a PhD from the Sonic Arts Research Centre, Queen’s University Belfast. He has investigated multidisciplinary approaches to networked music performance, as well as improvisational interfaces for human–computer interaction.

Matthew Yee-King is a lecturer at Goldsmiths and a computer music composer, performer and researcher who has released his solo compositions on Warp and Rephlex records. He has a DPhil in Computer Science and Artificial Intelligence from Sussex University, where he investigated techniques and applications of automated sound synthesis.

Mark d’Inverno is a Professor of Computer Science at Goldsmiths with close to 200 publications, including authored books, edited books, book chapters and peer-reviewed journal and conference articles. Mark is also a critically acclaimed jazz pianist, garnering praise from the BBC, the Guardian, the Observer and a range of jazz-focused publications.

Jon McCormack is an artist and academic who has worked extensively with generative computation and creativity. He is currently an ARC Future Fellow and Director of Monash University’s SensiLab.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Australian Research Council [DP160100166].
