
Introduction to the Special Issue on Music Performance Monitoring

Pages 1-2 | Published online: 16 Apr 2012

Music performance is a fascinating field of investigation at the crossroads of several scientific disciplines, such as musicology, acoustics, and the human and computer sciences. It is not only the subject of specific studies in music research but also a relevant case study for many other fields interested in non-verbal communication, motor control and cognition, among others. Researchers focusing on music performance often borrow paradigms, technologies or theoretical viewpoints from other fields, while researchers working in those fields are, in turn, often interested in case studies related to music performance. For example, researchers dealing with the simulation of realistic human movements may focus on virtual animated musicians as a case study, while a well-known automobile company develops robots mimicking musicians as a demonstration of the possibilities of domestic personal robots. Without underestimating the ‘having fun’ side of dealing with music, a deeper-rooted reason for this interest lies in the fact that music performance is probably one of the most demanding and complex activities that humans can be involved in; its study is therefore exceptionally challenging for scientific minds from various horizons. Studies related to music performance belong to a highly multidisciplinary field that can be approached from multiple angles, which keeps it particularly alive, with a constant renewal of insights, methodologies, applied technologies and points of view.

This special issue originates from a joint workshop on music performance monitoring between IPEM (Institute for Psychoacoustic and Electronic Music, Ghent, Belgium) and IRCAM (Institut de Recherche et de Coordination Acoustique/Musique, Paris, France). During this workshop, held in Ghent in December 2010, we presented and discussed ongoing research at the two institutes related to music performance, measurement and data analysis. The workshop was open to researchers and to the general public, but special attention was given to inviting musicians in order to involve them in discussions related to their activity.

Monitoring musical performance relates to the observation and recording of multiple aspects of musical performances. From a very wide perspective, this includes acoustic features derived from the sound, the movements producing the sound or accompanying the performance (expressive gestures), as well as data that are not directly accessible, such as the internal biological state of the performer (heart rate and other physiological variables). Monitoring has traditionally focused on the sound, for example to give feedback in pedagogical contexts, to automatically follow a score in contemporary music, or to characterize the specificities of different performances in studies of musical expressivity (loudness, timing). Over the last few years, however, the development of sensing technologies and their availability for measurement and real-time applications have opened up new fields of investigation: researchers can not only study performances from an acoustical viewpoint but also access and quantify the corporeal features creating the sound or accompanying the sound production. This has also raised new issues related to data handling, data analysis, the development of sensing devices and the ecological validity of measurements.

The papers included in this special issue illustrate different trends and responses to these new issues. Gesture monitoring depends, of course, on the musical instrument being played, and the way the data are handled depends on the final aim of the study. However, all the papers relate to gesture and movement in music performance in the sense that they all take measurements of performing musicians as a basis. The first five papers are related to modelling and data analysis, while the last three present more concrete application studies using performance monitoring as a basis for development or interaction.

The paper ‘Towards a conceptual framework for exploring and modelling expressive musical gestures’ by Rasamimanana focuses on bowing parameters in violin playing and proposes to encompass different approaches (a feature-based approach, dynamic modelling and a segmental approach) in a single framework accounting for the different constraints that a player must manage in order to master the instrument. The core question is how the acoustical constraints related to the instrument and the biomechanical constraints related to the player may explain the restricted set of gestures that the player uses. Special emphasis is put on possible applications in electronic music aimed at improving the sense of mediation.

The paper ‘Segmenting and parsing instrumentalists' gestures’ by Caramiaux, Wanderley, and Bevilacqua presents a segmental approach for modelling musician movements, focusing on the specific case of ancillary gestures in clarinet playing (i.e. gestures that do not directly produce the sound). An analysis algorithm is presented that models and segments the original movement data as a sequence of primitive shapes (defined as a dictionary). The influence of the dictionary size is discussed in relation to possible applications for modelling and segmentation. In particular, a distinction is made between individual shapes recognized in the data stream and patterns, or sequences of shapes, that can be analysed on a larger time scale.

The next paper, entitled ‘Assessing a clarinet player's performer gestures in relation to locally intended musical targets’, by Desmet, Nijs, Demey, Lesaffre, Martens, and Leman, also deals with ancillary, or expressive, gestures, from a rather different point of view. The focus here is on the relationship between musical targets, or musical motifs, and the expressive movements of the player. First, an analysis process is proposed to segment data from different body parts measured with an optical motion capture system, and an attempt is made to classify the extracted gestural segments by similarity (a bottom-up approach). This approach is then complemented by a top-down analysis relying on the player's annotations of the score and video recording in terms of expressive intentions, which highlights the intimate relation between musical intentions and bodily gestures.

The paper ‘Measures of facial muscle activation, intra-oral pressure and mouthpiece force in trumpet playing’ by Bianco, Freour, Cossette, Bevilacqua, and Caussé presents an experiment in which several key parameters of trumpet playing are simultaneously measured and analysed. The study focuses in particular on the temporal evolution of control parameters and their interrelation at three dynamic levels in two different situations (isolated and concatenated tones), revealing different underlying gestural strategies and possible coarticulation effects.

The paper ‘The influence of an audience on performers: A comparison between rehearsal and concert using audio, video and movement data’, by Moelants, Demey, Grachten, Wu, and Leman, focuses on the analysis of video, audio and sensor data of an ensemble (singer and viola da gamba) in order to characterize the differences between two performance contexts (dress rehearsal and concert). In particular, the authors show that differences in tempo, posture and intensity of arm movements reveal an intensification of the performance due to the interaction with the audience.

The paper ‘The Music Paint Machine: Stimulating self-monitoring through the generation of creative visual output using a technology-enhanced learning tool’ by Nijs, Moens, Lesaffre, and Leman presents an interactive system in which the bodily movements and the sound produced by the musician are used to create a digital painting in real time. The paper discusses the theoretical foundations for using such a tool in a pedagogical context to improve creativity and musical embodiment, and then presents an experiment aimed at evaluating the didactic relevance of the system.

The paper ‘The augmented string quartet: Experiments and gesture following’ by Bevilacqua, Baschet, and Lemouton describes the research process during a collaboration with composer Florence Baschet on a mixed acoustic-electronic piece involving a string quartet, gesture following and digital sound processes controlled by the players' movements. The sensing system for measuring bow movements is presented, as well as experiments carried out to assess the variability of gesture data in prototypical musical phrases. These preliminary data are then used to test and adjust a real-time gesture analysis system based on Hidden Markov Models.

Finally, the paper ‘Using narrative analysis to map the control parameters of enhanced operatic singing’ by Kochman, Coussement, Moelants, and Leman presents a methodology for designing technologically enhanced performance environments. The method is based on iterative prototyping and video analyses of available performances in order to account for communicative and expressive gestures in the development process. Narrative analysis is applied to the aria ‘Il dolce suono’ by Donizetti to create a technologically enhanced performance of the aria.

Each paper was reviewed by two experts in the field, and we would like to express our gratitude to these anonymous reviewers, whose constructive comments improved the quality of the contributions. We would particularly like to thank Alan Marsden for accepting and supporting this Special Issue, and for his availability at every step of the process. Amy Cox also helped with practical matters and provided assistance throughout the editing process.

We hope that the wide range of methodologies and viewpoints described and discussed throughout these papers will provide a good overview for readers unfamiliar with the topic, and that researchers working in the field of music performance monitoring will find material that opens up new insights and feeds their own research.
