Abstract
Research in the field of embodied music cognition has shown the importance of coupled processes of body activity (action) and multimodal representations of these actions (perception) in how music is processed. Technologies from the field of human–computer interaction (HCI) provide excellent means to intervene in, and extend, these coupled action–perception processes. In this article, this action–perception model is applied to a concrete HCI application called the “Conducting Master.” The application enables multiple users to interact with the system in real time in order to explore and learn how musical meter can be articulated in body movements (i.e., meter-mimicking gestures). Techniques are provided to model and automatically recognize these gestures so that multimodal feedback streams can be returned to the users. These techniques are based on template-based methods that allow meter-mimicking gestures to be approached explicitly from a spatiotemporal perspective. To conclude, some concrete setups are presented in which the functionality of the Conducting Master was evaluated.
Acknowledgments
This work is part of the EmcoMetecca project supported by the Flemish Government (http://www.ipem.ugent.be/EmcoMetecca). We thank Bart Moens for his help with the Java coding, Ivan Schepers and Raven van Noorden for their technical assistance, and the Centre for Speckled Computing (http://www.specknet.org) for providing the Orient Motion Capture system.