
Affective Communication through Air Jet Stimulation: Evidence from Event-Related Potentials

Mohamed Yacine Tsalamlal, Jean-Claude Martin, Mehdi Ammi & Mounia Ziat

ABSTRACT

In this article, the effectiveness of air jet stimulation in mediated emotional communication was investigated by assessing cross-modal influences of visual emotional expressions on tactile perception. Brain responses to combined visual face and air jet stimuli were measured using event-related potentials (ERPs), whereas emotional responses were assessed using self-reported pleasantness of the tactile stimulation. ERP results reveal significant differences in the somatosensory area between the different facial expressions for the same air jet intensity. Moreover, participants’ pleasantness ratings suggest an effect of the visual stimulus on the different tactile conditions, which correspond to three air jet stimulation intensities: low, medium, and high. These promising results provide evidence for the potential efficacy of this stimulation technique in activating the skin receptors that play an important role in social and affective behaviors.

Acknowledgments

This research has been partially supported by the IEEE Technical Committee on Haptics (TCH) through the Student Exchange Program for Cross-Disciplinary Fertilization.

Additional information

Funding

This work was supported by the IEEE Technical Committee on Haptics (TCH) [TCH Student Exchange Program for Cross-Disciplinary Fertilization (2013)].

Notes on contributors

Mohamed Yacine Tsalamlal

Mohamed Yacine Tsalamlal received the master’s degree in human-machine systems engineering from the Université de Lorraine, France, and the PhD degree in computer science from the Université Paris-Saclay. He is a postdoctoral researcher with the Architectures and Models for Interaction group at the LIMSI-CNRS lab. His research focuses on tactile interaction techniques for mediated social communication. He draws on multiple approaches (mechanics, robotics, and experimental psychology) to help design efficient multimodal affective interaction systems.

Jean-Claude Martin

Jean-Claude Martin received the PhD degree in computer science from Télécom Paris, France, in 1999, and a Habilitation to Direct Research in Computer Science in 2006. He is a full professor with the Université Paris-Sud (UPSUD) / LIMSI-CNRS, Orsay, France, where he heads the Cognition Perception Use group. He conducts multidisciplinary research on nonverbal affective interaction, including theoretical and experimental studies of human cognition and computational modeling of multimodal affective processes in virtual agents and robots. He is the editor-in-chief of the Springer Journal on Multimodal User Interfaces.

Mehdi Ammi

Mehdi Ammi is an associate professor with the Université Paris-Sud (UPSUD) / LIMSI-CNRS, Orsay, France, specializing in haptics for virtual reality and tele-operation. More specifically, he is interested in all aspects of haptic processes, ranging from physiological mechanisms to operational methodologies designed to integrate the haptic modality into different types of applications.

Mounia Ziat

Mounia Ziat earned an Engineer’s degree in Electronics and an M.S. and a Ph.D. in Cognitive Sciences. Her research interests include haptic device design and HCI, human perception, multisensory integration, and cognitive neuroscience. She is an associate professor at Northern Michigan University and is currently on sabbatical leave at UCLA.
