
Modelling audiovisual integration of affect from videos and music

Pages 516–529 | Received 05 Dec 2016, Accepted 13 Apr 2017, Published online: 02 May 2017

