
Affective Communication through Air Jet Stimulation: Evidence from Event-Related Potentials


References

  • Ackerley, R., Backlund Wasling, H., Liljencrantz, J., Olausson, H., Johnson, R. D., & Wessberg, J. (2014). Human C-tactile afferents are tuned to the temperature of a skin-stroking caress. Journal of Neuroscience, 34(8), 2879–2883.
  • Ackerley, R., Eriksson, E., & Wessberg, J. (2013, February). Ultra-late EEG potential evoked by preferential activation of unmyelinated tactile afferents in human hairy skin. Neuroscience Letters, 535, 62–66.
  • Adolphs, R., Damasio, H., Tranel, D., Cooper, G., & Damasio, A. R. (2000, April). A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 20(7), 2683–2690.
  • Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain and Cognition, 52(1), 61–69.
  • Allison, T., Puce, A., & McCarthy, G. (2000). Social perception from visual cues: Role of the STS region. Trends in Cognitive Sciences, 4(7), 267–278.
  • Ammi, M., Demulier, V., Caillou, S., Gaffary, Y., Tsalamlal, Y., Martin, J.-C., & Tapus, A. (2015). Haptic human-robot affective interaction in a handshaking social protocol. IEEE human-robot interaction conference, Oregon, USA.
  • App, B., McIntosh, D. N., Reed, C. L., & Hertenstein, M. J. (2011, June). Nonverbal channel use in communication of emotion: How may depend on why. Emotion, 11(3), 603–617.
  • Atkinson, A. P., & Adolphs, R. (2011). The neuropsychology of face perception: Beyond simple dissociations and functional selectivity. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 366(1571), 1726–1738.
  • Bailenson, J. N., Yee, N., Brave, S., Merget, D., & Koslow, D. (2007). Virtual interpersonal touch: Expressing and recognizing emotions through haptic devices. Human-Computer Interaction, 22, 325–353.
  • Balomenos, T., Raouzaiou, A., Ioannou, S., Drosopoulos, A., Karpouzis, K., & Kollias, S. (2005). Emotion analysis in man-machine interaction systems. In S. Bengio & H. Bourlard (Eds.), Machine learning for multimodal interaction. MLMI 2004. Lecture Notes in Computer Science (Vol. 3361, pp. 318–328). Berlin, Heidelberg: Springer.
  • Batty, M., & Taylor, M. J. (2003). Early processing of the six basic facial emotional expressions. Cognitive Brain Research, 17(3), 613–620.
  • Borchert, M., & Düsterhöft, A. (2005). Emotions in speech – Experiments with prosody and quality features in speech for use in categorical and dimensional emotion recognition environments. Proceedings of the 2005 IEEE international conference on natural language processing and knowledge engineering, IEEE NLP-KE’05 (pp. 147–151). Wuhan, China.
  • Busso, C., Deng, Z., Yildirim, S., Bulut, M., Lee, C. M., Kazemzadeh, A., … Narayanan, S. (2004). Analysis of emotion recognition using facial expressions, speech and multimodal information. Proceedings of the 6th international conference on multimodal interfaces – ICMI ’04 (p. 205).
  • Calvo, M. G., & Lundqvist, D. (2008, February). Facial expressions of emotion (KDEF): Identification under different display-duration conditions. Behavior Research Methods, 40(1), 109–115.
  • Castellano, G., Kessous, L., & Caridakis, G. (2007, September 19–21). Multimodal emotion recognition from expressive faces, body gestures and speech. Artificial intelligence and innovations 2007: From theory to applications, proceedings of the 4th IFIP international conference on artificial intelligence applications and innovations (AIAI 2007), Peania, Athens, Greece.
  • Courgeon, M., & Clavel, C. (2013). MARC: A framework that features emotion models for facial animation during human–computer interaction. Journal on Multimodal User Interfaces, 7(4), 311–319.
  • Creed, C., & Beale, R. (2006). Multiple and extended interactions with affective embodied agents. Proceedings of the 2006 workshop on the role of emotion in HCI, Stuttgart, Fraunhofer IRB Verlag.
  • De Gelder, B., & Vroomen, J. (2000, May). The perception of emotions by ear and by eye. Cognition & Emotion, 14(3), 289–311.
  • Deethardt, J. F., & Hines, D. G. (1983). Tactile communication differences and personality. Journal of Nonverbal Behavior, 8(2), 143. doi:10.1007/BF00987000
  • Dolan, R. J. (2002). Emotion, cognition, and behavior. Science, 298(5596), 1191–1194.
  • Eimer, M., & Holmes, A. (2007). Event-related brain potential correlates of emotional face processing. Neuropsychologia, 45(1), 15–31.
  • Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17(2), 124.
  • Ekman, P., & Friesen, W. V. (1975a). Unmasking the face: A guide to recognizing emotions from facial clues. Oxford: Prentice-Hall.
  • Ekman, P., & Friesen, W. V. (1975b). Unmasking the face: A guide to recognizing emotions from facial clues, no. 1968. Oxford, England: Prentice-Hall.
  • El Ayadi, M., Kamel, M. S., & Karray, F. (2011). Survey on speech emotion recognition: Features, classification schemes, and databases. Pattern Recognition, 44, 572–587.
  • Ellingsen, D. M., Wessberg, J., Chelnokova, O., Olausson, H., Laeng, B., & Leknes, S. (2014). In touch with your emotions: Oxytocin and touch change social impressions while others’ facial expressions can alter touch. Psychoneuroendocrinology, 39(1), 11–20.
  • Esau, N., Wetzel, E., Kleinjohann, L., & Kleinjohann, B. (2007). Realtime facial expression recognition using a fuzzy emotion model. 2007 IEEE international fuzzy systems conference (pp. 1–6).
  • Esposito, A. (2009). Affect in multimodal information. In J. Tao & T. Tan (Eds.), Affective information processing (pp. 203–226). London: Springer.
  • Essick, G. K., James, A., & McGlone, F. P. (1999). Psychophysical assessment of the affective components of non-painful touch. In NeuroReport: For rapid communication of neuroscience research (Vol. 10, pp. 2083–2087). US: Lippincott Williams & Wilkins.
  • Essick, G. K., McGlone, F., Dancer, C., Fabricant, D., Ragin, Y., Phillips, N., … Guest, S. (2010, February). Quantitative assessment of pleasant touch. Neuroscience & Biobehavioral Reviews, 34(2), 192–203.
  • Fragopanagos, N., & Taylor, J. G. (2005, May). Emotion recognition in human-computer interaction. Neural Networks, 18(4), 389–405.
  • Gaffary, Y., Eyharabide, V., Martin, J.-C., & Ammi, M. (2014). The impact of combining kinesthetic and facial expression displays on emotion recognition by users. International Journal of Human-Computer Interaction, 30(11), 904–920.
  • Gao, Y., Bianchi-Berthouze, N., & Meng, H. (2012). What does touch tell us about emotions in touchscreen-based gameplay? ACM Transactions on Computer-Human Interaction, 19(4), 31:1–31:30.
  • Ghandi, B. M., Nagarajan, R., & Desa, H. (2010). Real-time system for facial emotion detection using GPSO algorithm. ISIEA 2010-2010. IEEE symposium on industrial electronics and applications (pp. 40–45).
  • Goldman, A. I., & Sripada, C. S. (2005). Simulationist models of face-based emotion recognition. Cognition, 94(3), 193–213.
  • Gordon, I., Voos, A. C., Bennett, R. H., Bolling, D. Z., Pelphrey, K. A., & Kaiser, M. D. (2013, April). Brain mechanisms for processing affective touch. Human Brain Mapping, 34(4), 914–922.
  • Grubb, C. (2013). Multimodal emotion recognition. Technical Report. Retrieved from http://orzo.union.edu/Archives/SeniorProjects/2013/CS.2013/
  • Gunes, H., Piccardi, M., & Jan, T. (2007). Face and body gesture recognition for a vision-based multimodal analyzer. Proceedings of the Pan-Sydney area Workshop on Visual Information Processing, 36, 19–28. June 1, 2004.
  • Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4(6), 223–233.
  • Helle, P., & Perona, C. (2015). Pasadena houses 2000 [Photographs].
  • Henley, N. M. (1973). Status and sex: Some touching observations. Bulletin of the Psychonomic Society, 2(2), 91–93.
  • Hertenstein, M. J., Holmes, R., McCullough, M., & Keltner, D. (2009, August). The communication of emotion via touch. Emotion, 9(4), 566–573.
  • Hertenstein, M. J., Keltner, D., App, B., Bulleit, B. A., & Jaskolka, A. R. (2006a, August). Touch communicates distinct emotions. Emotion, 6(3), 528–533.
  • Hertenstein, M. J., Verkamp, J. M., Kerestes, A. M., & Holmes, R. M. (2006b, February). The communicative functions of touch in humans, nonhuman primates, and rats: A review and synthesis of the empirical research. Genetic, Social, and General Psychology Monographs, 132(1), 5–94.
  • Hoffman, E. A., & Haxby, J. V. (2000, January). Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nature Neuroscience, 3(1), 80–84.
  • Huisman, G., & Frederiks, A. D. (2013). Towards tactile expressions of emotion through mediated touch. CHI’13 Extended Abstracts on Human Factors in Computing Systems, 1575–1580.
  • Hussey, E., & Safford, A. (2009, January). Perception of facial expression in somatosensory cortex supports simulationist models. Journal of Neuroscience, 29(2), 301–302.
  • Izard, C. E. (1971). The face of emotion. New York: Appleton-Century-Crofts.
  • John, O., & Srivastava, S. (1999). The big five trait taxonomy: History, measurement, and theoretical perspectives. In L. A. Pervin & O. P. John (Eds.), Handbook of personality: Theory and research (Vol. 2, pp. 102–138). New York: Guilford Press.
  • Karpouzis, K., Caridakis, G., Kessous, L., Amir, N., Raouzaiou, A., Malatesta, L., & Kollias, S. (2007). Modeling naturalistic affective states via facial, vocal, and bodily expressions recognition. Proceedings of the ICMI 2006 and IJCAI 2007 international conference on artificial intelligence for human computing (pp. 91–112).
  • Keltner, D., & Kring, A. M. (1998). Emotion, social function, and psychopathology. Review of General Psychology, 2(3), 320.
  • Keysers, C., Kaas, J. H., & Gazzola, V. (2010). Somatosensation in social perception. Nature Reviews Neuroscience, 11(6), 417–428.
  • Kim, S., Georgiou, P. G., Lee, S., & Narayanan, S. (2007). Real-time emotion detection system using speech: Multi-modal fusion of different time-scale features. 2007 IEEE 9th international workshop on multimedia signal processing, MMSP 2007 – Proceedings (pp. 48–51).
  • Klimmt, C., & Hartmann, T. (2008). Mediated interpersonal communication in multiplayer video games: Implications for entertainment and relationship management. In E. A. Konijn, S. Utz, M. Tanis, & S. B. Barnes (Eds.), Mediated interpersonal communication (pp. 309–330). New York: Routledge
  • Knapp, M. L., & Hall, J. A. (2010). Nonverbal communication in human interaction (8th ed.). Cengage Learning.
  • Lewis, M., Haviland-Jones, J. M., & Barrett, L. F. (Eds.). (2010). Handbook of emotions (3rd ed.). New York, NY: Guilford Press.
  • Loken, L. S., Wessberg, J., Morrison, I., McGlone, F., & Olausson, H. (2009, May). Coding of pleasant touch by unmyelinated afferents in humans. Nature Neuroscience, 12(5), 547–548.
  • Machot, F. A., Mosa, A. H., Dabbour, K., Fasih, A., Schwarzlmüller, C., Ali, M., & Kyamakya, K. (2011). A novel real-time emotion detection system from audio streams based on Bayesian quadratic discriminate classifier for ADAS. Proceedings of the joint 3rd international workshop on nonlinear dynamics and synchronization, INDS’11, and 16th international symposium on theoretical electrical engineering, ISTET’11 (pp. 47–51).
  • Martin, W., Metallinou, A., Eyben, F., & Narayanan, S. (2010, September 26–30). Context-sensitive multimodal emotion recognition from speech and facial expression using bidirectional LSTM modeling. INTERSPEECH 2010, 11th annual conference of the international speech communication association (pp. 2362–2365). Makuhari, Chiba, Japan.
  • Matsumoto, D. (1992). More evidence for the universality of a contempt expression. Motivation and Emotion, 16(4), 363–368.
  • McGlone, F., Vallbo, A. B., Olausson, H., Loken, L., & Wessberg, J. (2007). Discriminative touch and emotional touch. Canadian Journal of Experimental Psychology / Revue canadienne de psychologie expérimentale, 61(3), 173–183.
  • McPartland, J., Cheung, C. H. M., Perszyk, D., & Mayes, L. C. (2010, October). Face-related ERPs are modulated by point of gaze. Neuropsychologia, 48(12), 3657–3660.
  • Montoya, P., & Sitges, C. (2006, January). Affective modulation of somatosensory-evoked potentials elicited by tactile stimulation. Brain Research, 1068(1), 205–212.
  • Niewiadomski, R., & Pelachaud, C. (2007). Model of facial expressions management for an embodied conversational agent. In A. R. Paiva, R. Prada, & R. Picard (Eds.), Affective computing and intelligent interaction SE – 2 (Vol. 4738, pp. 12–23). Berlin, Heidelberg: Springer.
  • Nwe, T. L., Foo, S. W., & De Silva, L. C. (2003). Speech emotion recognition using hidden Markov models. Speech Communication, 41, 603–623.
  • Olausson, H., Wessberg, J., Morrison, I., McGlone, F., & Vallbo, A. (2010, February). The neurophysiology of unmyelinated tactile afferents. Neuroscience & Biobehavioral Reviews, 34(2), 185–191.
  • Ozawa, S., Mieda, S., Date, M., Takada, H., & Kojima, A. (2014). MulDiRoH: An evaluation of facial direction expression in teleconferencing on a multi-view display system. In S. Yamamoto (Ed.), Human interface and the management of information: Information and knowledge design and evaluation SE – 52 (Vol. 8521, pp. 525–535). Switzerland: Springer International Publishing.
  • Pantic, M., & Rothkrantz, L. J. M. (2003, September). Toward an affect-sensitive multimodal human-computer interaction. Proceedings of the IEEE, 91(9), 1370–1390.
  • Park, Y.-W., Baek, K.-M., & Nam, T.-J. (2013). The roles of touch during phone conversations: Long-distance couples’ use of POKE in their homes. Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1679–1688). Paris, France.
  • Partala, T., & Surakka, V. (2004, April). The effects of affective interventions in human–computer interaction. Interacting with Computers, 16(2), 295–309.
  • Pasquariello, S., & Pelachaud, C. (2002). Greta: A simple facial animation engine. In Soft computing and industry - Recent applications (pp. 511–525). London: Springer.
  • Patel, A. T. (2013). The Handbook of Touch: Neuroscience, behavioral, and health perspectives. American Journal of Physical Medicine & Rehabilitation, 92(10), 945.
  • Picard, R. W. (2000). Affective computing. Cambridge, MA: MIT Press.
  • Pierre-Yves, O. (2003). The production and recognition of emotions in speech: Features and algorithms. International Journal of Human-Computer Studies, 59, 157–183.
  • Planalp, S. (1993). Communication, cognition, and emotion. Communication Monographs, 60, 3–9.
  • Pourtois, G. (2004). Electrophysiological correlates of rapid spatial orienting towards fearful faces. Cerebral Cortex (New York, NY: 1991), 14(6), 619–633.
  • Pourtois, G., Sander, D., Andres, M., Grandjean, D., Reveret, L., Olivier, E., & Vuilleumier, P. (2004). Dissociable roles of the human somato-sensory and superior temporal cortices for processing social face signals. The European Journal of Neuroscience, 20(12), 3507–3515.
  • Ravaja, N., Harjunen, V., Ahmed, I., Jacucci, G., & Spapé, M. M. (2017). Feeling touched: Emotional modulation of somatosensory potentials to interpersonal touch. Scientific Reports, 7, 40504. doi:10.1038/srep40504
  • Rossion, B., Caldara, R., Seghier, M., Schuller, A. M., Lazeyras, F., & Mayer, E. (2003). A network of occipito-temporal face-sensitive areas besides the right middle fusiform gyrus is necessary for normal face processing. Brain, 126(11), 2381–2395.
  • Saldien, J., Goris, K., Yilmazyildiz, S., Verhelst, W., & Lefeber, D. (2008). On the design of the huggable robot Probo. Journal of Physical Agents, 2(2), 3–11.
  • Schirmer, A., Teh, K. S., Wang, S., Vijayakumar, R., Ching, A., Nithianantham, D., … Cheok, A. D. (2011, January). Squeeze me, but don’t tease me: Human and mechanical touch enhance visual attention and emotion discrimination. Social Neuroscience, 6(3), 219–230.
  • Sel, A., Forster, B., & Calvo-Merino, B. (2014). The emotional homunculus: ERP evidence for independent somatosensory responses during facial emotional processing. The Journal of Neuroscience: the Official Journal of the Society for Neuroscience, 34(9), 3263–3267.
  • Spapé, M. M., Harjunen, V., & Ravaja, N. (2017). Effects of touch on emotional face processing: A study of event-related potentials, facial EMG and cardiac activity. Biological Psychology, 124, 1–10.
  • Stiehl, W. D., & Breazeal, C. (2005). Affective touch for robotic companions. In J. Tao, T. Tan, & R. W. Picard (Eds.), Affective computing and intelligent interaction. ACII 2005. Lecture Notes in Computer Science (Vol. 3784). Berlin, Heidelberg: Springer.
  • Streltsova, A., & McCleery, J. P. (2014). Neural time-course of the observation of human and non-human object touch. Social Cognitive and Affective Neuroscience, 9(3), 333–341.
  • Tao, J., Kang, Y., & Li, A. (2006). Prosody conversion from neutral speech to emotional speech. IEEE Transactions on Audio, Speech, and Language Processing, 14, 1145–1153.
  • Theune, M., Meijs, K., Heylen, D., & Ordelman, R. (2006). Generating expressive speech for storytelling applications. IEEE Transactions on Audio, Speech, and Language Processing, 14, 1137–1144.
  • Tsalamlal, M., Ouarti, N., Martin, J.-C., & Ammi, M. (2014a). Haptic communication of dimensions of emotions using air jet based tactile stimulation. Journal on Multimodal User Interfaces, 9(1), 1–2.
  • Tsalamlal, M. Y., Issartel, P., Ouarti, N., & Ammi, M. (2014b). HAIR: HAptic feedback with a mobile AIR jet. Robotics and automation (ICRA), 2014 IEEE international conference (pp. 2699–2706). Hong Kong.
  • Tsalamlal, M. Y., Ouarti, N., Martin, J.-C., & Ammi, M. (2013). EmotionAir: Perception of emotions from air jet based tactile stimulation. Affective computing and intelligent interaction (ACII), 2013 Humaine Association conference (pp. 215–220). Geneva, Switzerland.
  • Tsetserukou, D., & Neviarouskaya, A. (2010). World’s first wearable humanoid robot that augments our emotions. Proceedings of the 1st augmented human international conference – AH ’10 (pp. 1–10). Megève, France: Springer.
  • Vogt, T., André, E., & Wagner, J. (2008). Automatic recognition of emotions from speech: A review of the literature and recommendations for practical realisation. In Affect and emotion in human-computer interaction (pp. 75–91). Berlin, Heidelberg: Springer.
  • Wessberg, J., Olausson, H. A., Fernström, K. W., & Vallbo, A. B. (2003). Receptive field properties of unmyelinated tactile afferents in the human skin. Journal of Neurophysiology, 89(3), 1567–1575.
  • Xie, L., Deng, Z., & Cox, S. (2013, November). Multimodal joint information processing in human machine interaction: Recent advances. Multimedia Tools and Applications, 73(1), 267–271.
  • Xue, Y.-L., Mao, X., Li, Z., & Diao, W.-H. (2007). Modeling of layered fuzzy facial expression generation. In V. Duffy (Ed.), Digital human modeling SE – 29 (Vol. 4561, pp. 243–252). Berlin, Heidelberg: Springer.
  • Yohanan, S. J. (2012). The Haptic Creature: Social human-robot interaction through affective touch (Doctoral dissertation). University of British Columbia. Retrieved from https://open.library.ubc.ca/cIRcle/collections/24/items/1.0052135
  • Zeng, Z., Hu, Y., Roisman, G. I., Wen, Z., Fu, Y., & Huang, T. S. (2007). Audio-visual spontaneous emotion recognition. In T. S. Huang, A. Nijholt, M. Pantic, & A. Pentland (Eds.), Artificial intelligence for human computing. Lecture notes in computer science (Vol. 4451). Berlin, Heidelberg: Springer.
  • Ziat, M., & Raisamo, R. (2017). The cutaneous-rabbit illusion: What if it is not a rabbit? IEEE World Haptics Conference (WHC) (pp. 540–545). Munich.
