Abstract
This article presents PLXTRM, a system that tracks picking-hand micro-gestures for real-time music applications and live performance. PLXTRM taps into the existing gesture vocabulary of the guitar player. First, PLXTRM provides a continuous controller that doesn't require the musician to learn and integrate extrinsic gestures, avoiding additional cognitive load. Beyond the musical applications enabled by this continuous control, the second aim is to harness PLXTRM's predictive power. Using a reservoir network, string onsets are predicted within a certain time frame, based on the spatial trajectory of the guitar pick. In this time frame, manipulations can be applied to the audio signal before the string actually sounds, 'prefacing' note onsets. Third, PLXTRM facilitates the distinction of playing features such as up-strokes vs. down-strokes, string selection and the continuous velocity of gestures, thereby exploring new expressive possibilities.
Acknowledgements
The authors would like to thank Johannes Taelman for providing technical support to this project. Taelman is an independent developer and creator of the Axoloti board and community: http://www.axoloti.com/.
Notes
4 Axoloti is a platform that blends the sketching of digital audio algorithms with the musical playability of standalone hardware. It is an open-source project by an independent Belgian developer. See Section 4.5.