ABSTRACT
Language processing always involves a combination of sensory (auditory or visual) and motor modalities (vocal or manual). In line with embodied cognition theories, we additionally assume a semantically implied modality (SIM) due to modality references of the underlying concept. Understanding ear-related words (e.g. “noise”), for example, should activate the auditory SIM. In the present study, we investigated the influence of the SIM on sensory-motor modality switching (e.g. switching between the auditory-vocal and visual-manual combination). During modality switching, participants categorised words with regard to their SIM (e.g. ear- versus eye-related words). Overall performance was improved and switch costs were reduced whenever there was concordance between SIMs and sensory-motor modalities (e.g. an auditory presentation of ear-related words). Thus, the present study provides the first evidence for semantic effects during sensory-motor modality switching, in terms of facilitation effects whenever the SIM was in concordance with the sensory-motor modalities.
Acknowledgement
We thank Diane Pecher and the anonymous reviewers for helpful comments on an earlier version of this paper.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes
1. Note that modality compatibility actually represents a continuum rather than a “pure” dichotomy, so that “incompatible” is relative and means “less compatible”.
2. Previous studies on semantic modality switching in language processing (Pecher et al., 2003) revealed semantic modality switch costs. To replicate this finding, we conducted a further analysis of our data, in which we focused on semantic modality switch costs (e.g. processing an ear-related word after processing an eye-related word) instead of sensory-motor modality switch costs. In this analysis, we only considered trials involving the visual-manual sensory-motor modality combination (in both the current trial and the preceding trial; that is, repetition trials with regard to sensory-motor modality combinations), because these were most comparable to the constantly used visual-manual combination in the investigations by Pecher et al. (2003). However, this analysis revealed no significant difference between semantic modality switch and repetition trials (t < 1). This was probably due to a strong influence of switching between sensory-motor modalities, which were constant in the study by Pecher et al. (2003), as well as to considerable differences between the two studies with regard to the experimental design.
3. However, it is important to note that this interaction does not provide information about the general semantic effect in either group. There was a significant main effect of semantics in the RT data analysis, and the interaction between the semantic effect and group was not significant.
4. This interaction was not confirmed in a similar experiment in our lab, in which the semantic categorisation did not focus attention on the modalities. Participants were instructed to categorise words into modality-unrelated categories (i.e. into the categories “concrete object” vs. “abstract concept”) – instead of the categorisation with regard to the SIM – during switching between different sensory-motor modality combinations. RT data revealed a marginally significant semantic effect (F(1, 23) = 3.3, p = .085) as well as a marginally significant interaction between modality transition and semantic effect (F(1, 23) = 3.3, p = .085). Thus, the semantic effects of Experiment 1 were further confirmed in this experiment (although they were diminished to some extent – probably due to the altered semantic categorisation). However, the interaction between modality compatibility and semantic effects found in the RT data of Experiment 1 was not significant in that experiment (F < 1).