Original Articles

Audio-visual integration of emotional cues in song

William Forde Thompson, Frank A. Russo & Lena Quinto
Pages 1457-1470 | Received 13 Feb 2007, Published online: 12 Nov 2008


Read on this site (5)

John R. Taylor & Roger T. Dean. (2021) Influence of a continuous affect ratings task on listening time for unfamiliar art music. Journal of New Music Research 50:3, pages 242-258.
Trenton C. Johanis & Mark A. Schmuckler. (2021) Investigation of Multisensory Harmonic Priming: Audiovisual Integration in Chord Perception. Auditory Perception & Cognition 4:1-2, pages 74-96.
Lawrence D. Rosenblum, James W. Dias & Josh Dorsi. (2017) The supramodal brain: implications for auditory perception. Journal of Cognitive Psychology 29:1, pages 65-87.
Steven R. Livingstone, William F. Thompson, Marcelo M. Wanderley & Caroline Palmer. (2015) Common cues to emotion in the dynamic facial expressions of speech and song. The Quarterly Journal of Experimental Psychology 68:5, pages 952-970.
Suranga Chandima Nanayakkara, Lonce Wyse, S.H. Ong & Elizabeth A. Taylor. (2013) Enhancing Musical Experience for the Hearing-Impaired Using Visual and Haptic Displays. Human–Computer Interaction 28:2, pages 115-160.

Articles from other publishers (43)

Marco Susino, William Forde Thompson, Emery Schubert & Mary Broughton. (2024) Emotional Responses to Music: The Essential Inclusion of Emotion Adaptability and Situational Context. Empirical Studies of the Arts.
Jacob I. Feldman, Alexander Tu, Julie G. Conrad, Wayne Kuang, Pooja Santapuram & Tiffany G. Woynaroski. (2022) The Impact of Singing on Visual and Multisensory Speech Perception in Children on the Autism Spectrum. Multisensory Research 36:1, pages 57-74.
Elke B. Lange, Jens Fünderich & Hartmut Grimm. (2022) Multisensory integration of musical emotion perception in singing. Psychological Research 86:7, pages 2099-2114.
Pia Hauck, Christoph von Castell & Heiko Hecht. (2022) Crossmodal Correspondence between Music and Ambient Color Is Mediated by Emotion. Multisensory Research 35:5, pages 407-446.
Xin Zhou, Ying Wu, Yingcan Zheng, Zilun Xiao & Maoping Zheng. (2021) The mechanism and neural substrate of musical emotions in the audio-visual modality. Psychology of Music 50:3, pages 779-796.
Ilana Harris & Mats B. Küssner. (2020) Come on Baby, Light My Fire: Sparking Further Research in Socio-Affective Mechanisms of Music Using Computational Advancements. Frontiers in Psychology 11.
Dominik Gall & Marc Erich Latoschik. (2020) Visual angle modulates affective responses to audiovisual stimuli. Computers in Human Behavior 109, pages 106346.
Chuanji Gao, Christine E. Weber & Svetlana V. Shinkareva. (2019) The brain basis of audiovisual affective processing: Evidence from a coordinate-based activation likelihood estimation meta-analysis. Cortex 120, pages 66-77.
Ulf A. S. Holbrook. (2019) Sound Objects and Spatial Morphologies. Organised Sound 24:1, pages 20-29.
Martina Di Mauro, Enrico Toffalini, Massimo Grassi & Karin Petrini. (2018) Effect of Long-Term Music Training on Emotion Perception From Drumming Improvisation. Frontiers in Psychology 9.
Georgios Michail & Julian Keil. (2018) High cognitive load enhances the susceptibility to non-speech audiovisual illusions. Scientific Reports 8:1.
Steven R. Livingstone & Frank A. Russo. (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLOS ONE 13:5, pages e0196391.
Michael Schutz. (2017) Acoustic Constraints and Musical Consequences: Exploring Composers' Use of Cues for Musical Emotion. Frontiers in Psychology 8.
Kamil Imbir & Maria Gołąb. (2016) Affective reactions to music: Norms for 120 excerpts of modern and classical music. Psychology of Music 45:3, pages 432-449.
George Waddell & Aaron Williamon. (2017) Eye of the Beholder: Stage Entrance Behavior and Facial Expression Affect Continuous Quality Ratings in Music Performance. Frontiers in Psychology 8.
Janne Weijkamp & Makiko Sadakata. (2016) Attention to affective audio-visual information: Comparison between musicians and non-musicians. Psychology of Music 45:2, pages 204-215.
Laura S. Brown. (2016) The Influence of Music on Facial Emotion Recognition in Children with Autism Spectrum Disorder and Neurotypical Children. Journal of Music Therapy, pages thw017.
Helen F. Mitchell & Raymond A. R. MacDonald. (2016) What you see is what you hear: The importance of visual priming in music performer identification. Psychology of Music 44:6, pages 1361-1371.
Matthew Poon & Michael Schutz. (2015) Cueing musical emotions: An empirical analysis of 24-piece sets by Bach and Chopin documents parallels with emotional speech. Frontiers in Psychology 6.
Lucy M. McGarry, Jaime A. Pineda & Frank A. Russo. (2014) The role of the extended MNS in emotional and nonemotional judgments of human song. Cognitive, Affective, & Behavioral Neuroscience 15:1, pages 32-44.
Jonna K. Vuoskoski & Tuomas Eerola. (2013) Extramusical information contributes to emotions induced by music. Psychology of Music 43:2, pages 262-274.
Charlotte Krahé, Ulrike Hahn & Kathryn Whitney. (2013) Is seeing (musical) believing? The eye versus the ear in emotional responses to music. Psychology of Music 43:1, pages 140-148.
Lena R. Quinto, William F. Thompson, Christian Kroos & Caroline Palmer. (2014) Singing emotionally: a study of pre-production, production, and post-production facial expressions. Frontiers in Psychology 5.
Jonna K. Vuoskoski, Marc R. Thompson, Eric F. Clarke & Charles Spence. (2013) Crossmodal interactions in the perception of expressivity in musical performance. Attention, Perception, & Psychophysics 76:2, pages 591-604.
Helen F. Mitchell & Raymond A. R. MacDonald. (2012) Listeners as spectators? Audio-visual integration improves music performer identification. Psychology of Music 42:1, pages 112-127.
Dennis Reidsma, Mustafa Radha & Anton Nijholt. (2014) Digital Da Vinci, pages 79-98.
Roni Y. Granot, Rona Israel-Kolatt, Avi Gilboa & Tsafrir Kolatt. (2013) Accuracy of Pitch Matching Significantly Improved by Live Voice Model. Journal of Voice 27:3, pages 390.e13-390.e20.
Lisa P. Chan, Steven R. Livingstone & Frank A. Russo. (2013) Facial Mimicry in Response to Song. Music Perception 30:4, pages 361-367.
William Forde Thompson. (2013) The Psychology of Music, pages 107-140.
William Forde Thompson & Paolo Ammirante. (2012) The Oxford Handbook of Thinking and Reasoning, pages 774-788.
Andrei C. Miu & Felicia Rodica Balteş. (2012) Empathy Manipulation Impacts Music-Induced Emotions: A Psychophysiological Study on Opera. PLoS ONE 7:1, pages e30618.
Geoffrey P. Lantos & Lincoln G. Craton. (2012) A model of consumer response to advertising music. Journal of Consumer Marketing 29:1, pages 22-42.
Silke Paulmann, Debra Titone & Marc D. Pell. (2012) How emotional prosody guides your way: Evidence from eye movements. Speech Communication 54:1, pages 92-107.
Jennifer Huang & Carol Lynne Krumhansl. (2011) What does seeing the performer add? It depends on musical style, amount of stage behavior, and audience expertise. Musicae Scientiae 15:3, pages 343-364.
Ramona Kaiser & Peter E. Keller. (2011) Music’s impact on the visual perception of emotional dyadic interactions. Musicae Scientiae 15:2, pages 270-287.
Bradley W. Vines, Carol L. Krumhansl, Marcelo M. Wanderley, Ioana M. Dalca & Daniel J. Levitin. (2011) Music to my eyes: Cross-modal interactions in the perception of emotions in musical performance. Cognition 118:2, pages 157-170.
T Mizumoto, T Otsuka, K Nakadai, T Takahashi, K Komatani, T Ogata & H G Okuno. (2010) Human-robot ensemble between robot thereminist and human percussionist using coupled oscillator model.
Lena Quinto, William Forde Thompson, Frank A. Russo & Sandra E. Trehub. (2010) A comparison of the McGurk effect for spoken and sung syllables. Attention, Perception, & Psychophysics 72:6, pages 1450-1454.
Alexandra Jesse & Dominic W. Massaro. (2010) Seeing a singer helps comprehension of the song’s lyrics. Psychonomic Bulletin & Review 17:3, pages 323-328.
William Forde Thompson, Frank A. Russo & Steven R. Livingstone. (2010) Facial expressions of singers influence perceived pitch relations. Psychonomic Bulletin & Review 17:3, pages 317-322.
Karin Petrini, Phil McAleer & Frank Pollick. (2010) Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence. Brain Research 1323, pages 139-148.
Anna Järvinen-Pasley, Bradley W. Vines, Kiley J. Hill, Anna Yam, Mark Grichanik, Debra Mills, Allan L. Reiss, Julie R. Korenberg & Ursula Bellugi. (2010) Cross-modal influences of affect across social and non-social domains in individuals with Williams syndrome. Neuropsychologia 48:2, pages 456-466.
