
An Emotion-aware Interactive Concert System: A Case Study in Realtime Physiological Sensor Analysis

Pages 261-269 | Received 15 Mar 2016, Accepted 24 May 2017, Published online: 09 Jun 2017

References

  • Collins, N. (2007). Musical robots and listening machines. In N. Collins & J. d’Escriván (Eds.), The Cambridge companion to electronic music (pp. 171–184). Cambridge: Cambridge University Press.
  • Collins, N. (2011). SCMIR: A SuperCollider music information retrieval library. Proceedings of the International Computer Music Conference (ICMC 2011), Huddersfield.
  • Eerola, T. (2012). Modeling listeners’ emotional response to music. Topics in Cognitive Science, 4, 607–624.
  • Guyon, I., & Elisseeff, A. (2003). An introduction to variable and feature selection. The Journal of Machine Learning Research, 3, 1157–1182.
  • Janssen, J. H., van den Broek, E. L., & Westerink, J. H. D. M. (2012). Tune in to your emotions: A robust personalized affective music player. User Modeling and User-Adapted Interaction, 22, 255–279.
  • Juslin, P. N., & Sloboda, J. A. (2001). Music and emotion: Theory and research. Oxford: Oxford University Press.
  • Kim, J., & André, E. (2008). Emotion recognition based on physiological changes in music listening. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30, 2067–2083.
  • Kim, Y. E., Schmidt, E. M., Migneco, R., Morton, B. G., Richardson, P., Scott, J., … Turnbull, D. (2010). Music emotion recognition: A state of the art review. Proceedings of ISMIR, Utrecht.
  • Miranda, E. R., & Wanderley, M. M. (2006). New digital musical instruments: Control and interaction beyond the keyboard. Middleton, WI: A-R Editions.
  • Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.
  • Picard, R. W., Vyzas, E., & Healey, J. (2001). Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23, 1175–1191.
  • Rowe, R. (2001). Machine musicianship. Cambridge, MA: MIT Press.
  • Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178.
  • Thorogood, M., & Pasquier, P. (2013). Impress: A machine learning approach to soundscape affect classification for a music performance environment. Proceedings of NIME, Daejeon, South Korea.
  • van den Broek, E. L., Lisy, V., Westerink, J. H., Schut, M. H., & Tuinenbreijer, K. (2009). Biosignals as an advanced man-machine interface. (Research report IS15-IS24).
  • van ’t Klooster, A. R. (2011). Balancing art and technology: The aesthetic experience in body and technology mediated artworks (PhD thesis). Sunderland University.
  • van ’t Klooster, A. (2016). Creating emotion-sensitive interactive artworks: Three case studies. Leonardo, 7–8. doi:10.1162/LEON_a_01344
  • Vermeulen, V. (2014). Affective computing, biofeedback and psychophysiology as new ways for interactive music composition and performance. eContact!, 16. Retrieved from http://cec.sonus.ca/econtact/16_3/vermeulen_affectivecomputing.html
  • Wilson, S. (2002). Information arts: Intersections of science, art and technology. Cambridge, MA: MIT Press.
  • Wright, M. (2005). Open sound control: An enabling technology for musical networking. Organised Sound, 10, 193–200.
