
PLXTRM: Prediction-Led eXtended-guitar Tool for Real-time Music applications and live performance

Pages 187-200 | Received 22 Jun 2016, Accepted 11 Jan 2017, Published online: 13 Feb 2017

References

  • Bressan, F. (2014). The preservation and restoration of systems for automatic music performance. In S. Canazza & A. Rodà (Eds.), Proceedings of the 1st International Workshop on Computer and Robotic Systems for Automatic Music Performance (SAMP14), in conjunction with the 13th International Conference on Intelligent Autonomous Systems (IAS) (pp. 1–8). Venezia, Italy.
  • Craenen. (2012). Instruments for new ears. World New Music Magazine, 22, 90–99.
  • Essl, G., & O’Modhrain, S. (2006). An enactive approach to the design of new tangible musical instruments. Organised Sound, 11, 285–296.
  • Fala, J., Keshap, A., Doerning, M., & Barbeau, J. (1991). Note sensing in M.I.D.I. guitars and the like. US Patent 5,033,353.
  • Fernando, C., & Sojakka, S. (2003). Pattern recognition in a bucket. Advances in artificial life (pp. 588–597). Berlin: Springer.
  • French, R. M. (2008). Engineering the guitar: Theory and practice. Berlin: Springer Science & Business Media.
  • Godøy, R. I., & Leman, M. (2010). Musical gestures: Sound, movement, and meaning. New York: Routledge.
  • Holm, P., & Williams, B. (2014). Electronic guitar pick and method. US Patent App. 14/092,709.
  • Hoppe, D., Brandmeyer, A., Sadakata, M., Timmers, R., & Desain, P. (2006). The effect of real-time visual feedback on the training of expressive performance skills. 9th International Conference on Music Perception and Cognition (ICMPC9), The Society for Music Perception & Cognition (SMPC) and European Society for the Cognitive Sciences of Music (ESCOM), Bologna, Italy.
  • Howard, D. M., Welch, G. F., Brereton, J., Himonides, E., DeCosta, M., Williams, J., & Howard, A. W. (2004). WinSingad: A real-time display for the singing studio. Logopedics Phoniatrics Vocology, 29, 135–144.
  • Keebler, J. R., Wiltshire, T. J., Smith, D. C., Fiore, S. M., & Bedwell, J. S. (2014). Shifting the paradigm of music instruction: Implications of embodiment stemming from an augmented reality guitar learning system. Frontiers in Psychology, 5, 1–18.
  • Kessous, L., Castet, J., & Arfib, D. (2006). ‘GXtar’, an interface using guitar techniques. Proceedings of the 2006 Conference on New Interfaces for Musical Expression (pp. 192–195). Paris, France: IRCAM–Centre Pompidou.
  • Lähdeoja, O. (2015). An augmented guitar with active acoustics. Proceedings of the Sound and Music Computing Conference (SMC). Retrieved from http://smcnetwork.org/node/1888
  • Lähdeoja, O., Wanderley, M., & Malloch, J. (2009). Instrument augmentation using ancillary gestures for subtle sonic effects. Proceedings of the 6th Sound and Music Computing Conference (SMC 2009) (pp. 327–330). Porto, Portugal.
  • Larger, L., Soriano, M. C., Brunner, D., Appeltant, L., Gutiérrez, J. M., Pesquera, L., Mirasso, C. R., & Fischer, I. (2012). Photonic information processing beyond Turing: An optoelectronic implementation of reservoir computing. Optics Express, 20, 3241–3249.
  • Larsen, J. V., Overholt, D., & Moeslund, T. B. (2013). The actuated guitar: A platform enabling alternative interaction methods. Proceedings of the Sound and Music Computing Conference (SMC) (pp. 235–238). Stockholm, Sweden: Logos Verlag Berlin.
  • Leman, M. (2007). Embodied music cognition and mediation technology. Cambridge, MA: The MIT Press.
  • Leman, M. (2016). The expressive moment: how music interaction shapes empowerment. Cambridge, MA: The MIT Press.
  • Lukoševičius, M., & Jaeger, H. (2009). Reservoir computing approaches to recurrent neural network training. Computer Science Review, 3, 127–149.
  • Maes, P.-J., Leman, M., Palmer, C., & Wanderley, M. M. (2014). Action-based effects on music perception. Frontiers in Psychology, 4, 1–14.
  • Maes, P.-J., Nijs, L., & Leman, M. (2015). A conceptual framework for music-based interaction systems. In R. Bader (Ed.), Springer Handbook of Systematic Musicology. Berlin, Heidelberg: Springer.
  • McPherson, A. P., Jack, R. H., & Moro, G. (2016). Action-sound latency: Are our tools fast enough? Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2016) (pp. 20–25). Brisbane, Australia: Queensland Conservatorium Griffith University.
  • Miranda, E. R., & Wanderley, M. M. (2006). New digital musical instruments: control and interaction beyond the keyboard (Vol. 21). Middleton, WI: AR Editions Inc.
  • Moens, B., & Leman, M. (2015). Alignment strategies for the entrainment of music and movement rhythms. Annals of the New York Academy of Sciences, 1337, 86–93.
  • Moens, B., Muller, C., van Noorden, L., Franěk, M., Celie, B., Boone, J., Bourgois, J., & Leman, M. (2014). Encouraging spontaneous synchronisation with D-Jogger, an adaptive music player that aligns movement and music. PLoS ONE, 9(12), 1–40.
  • Moens, B., van Noorden, L., Leman, M., Nordahl, R., Fontana, F., & Brewster, S. (2010). D-Jogger: A multimodal music interface for music selection based on user step frequency. Proceedings of the Haptic and Audio Interaction Design Conference (p. 2). Barcelona: Universitat Pompeu Fabra.
  • Oda, R., Finkelstein, A., & Fiebrink, R. (2013). Towards note-level prediction for networked music performance. NIME (pp. 94–97).
  • Pakarinen, J., & Puputti, T. (2008). Slide guitar synthesizer with gestural control. Proceedings of the 8th International Conference on New Interfaces for Musical Expression (NIME08) (pp. 5–7). Genoa, Italy.
  • Paquot, Y., Duport, F., Smerieri, A., Dambre, J., Schrauwen, B., Haelterman, M., & Massar, S. (2012). Optoelectronic reservoir computing. Scientific Reports, 2, 1–15.
  • Perng, C., Smith, J., & Rossing, T. (2011). Harpsichord sound synthesis using a physical plectrum model interfaced with the digital waveguide. International Conference on Digital Audio Effects (DAFx-11). Paris, France: IRCAM–Centre Pompidou.
  • Polson, R. (1982). Digital high speed guitar synthesizer. US Patent 4,336,734.
  • Schrauwen, B., Verstraeten, D., & Van Campenhout, J. (2007). An overview of reservoir computing: Theory, applications and implementations. Proceedings of the 15th European Symposium on Artificial Neural Networks. Bruges, Belgium.
  • Thorpe, C. W. (2002). Visual feedback of acoustic voice features in voice training. Paper presented at the Australian International Conference on Speech Science and Technology. Retrieved from https://www.singandsee.com/articles/SST2002-Thorpe098.pdf
  • Trail, S., Dean, M., Odowichuk, G., Tavares, T. F., Driessen, P. F., Schloss, W. A., & Tzanetakis, G. (2012). Non-invasive sensing and gesture control for pitched percussion hyper-instruments using the Kinect. NIME. Retrieved from http://vhosts.eecs.umich.edu/nime2012/Proceedings/NIME2012WebProceedings.html
  • Vanegas, R. (2007). The MIDI pick: Trigger serial data, samples, and MIDI from a guitar pick. Proceedings of the 7th International Conference on New Interfaces for Musical Expression (pp. 330–332). New York: ACM.
  • von dem Knesebeck, A., & Zölzer, U. (2010). Comparison of pitch trackers for real-time guitar effects. Proceedings of the 13th International Conference on Digital Audio Effects (DAFx) (pp. 266–269). Graz.
  • Welch, G. F., Howard, D. M., Himonides, E., & Brereton, J. (2005). Real-time feedback in the singing studio: An innovatory action-research project using new voice technology. Music Education Research, 7, 225–249.
  • Xiao, X., Tome, B., & Ishii, H. (2014). Andante: Walking figures on the piano keyboard to visualize musical motion. In B. Caramiaux, K. Tahiroglu, R. Fiebrink, & A. Tanaka (Eds.), Proceedings of the 14th International Conference on New Interfaces for Musical Expression. London: Goldsmiths, University of London.
