Article Addendum

What can crossmodal aftereffects reveal about neural representation and dynamics?

Pages 479-481 | Received 24 Jun 2009, Accepted 24 Jun 2009, Published online: 01 Nov 2009

Abstract

The brain continuously adapts to incoming sensory stimuli, which can lead to perceptual illusions in the form of aftereffects. Recently we demonstrated that motion aftereffects transfer between vision and touch.Citation1 Here, the adapted brain state induced by one modality has consequences for processes in another modality, implying that somewhere in the processing stream, visual and tactile motion have shared underlying neural representations. We propose the adaptive processing hypothesis—any area that processes a stimulus adapts to the features of the stimulus it represents, and this adaptation has consequences for perception. This view argues that there is no single locus of an aftereffect. Rather, aftereffects emerge when the test stimulus used to probe the effect of adaptation requires processing of a given type. The illusion will reflect the properties of the brain area(s) that support that specific level of representation. We further suggest that many cortical areas are more process-dependent than modality-dependent, with crossmodal interactions reflecting shared processing demands in even ‘early’ sensory cortices.


Introduction

Aftereffects are a powerful behavioral paradigm used to infer how information is represented in the brain and how neural populations and circuits change over time – neural dynamics. In the case of motion aftereffect paradigms, for example, an observer stares at visual motion such as a drifting grating (the adapting stimulus) for a period of seconds. When this visual stimulus is suddenly changed to a static visual grating (the test stimulus), the observer sees this stationary stimulus as if it were moving opposite the direction of the original motion for a short period of time.Citation2,Citation3 From this simple behavioral paradigm, we gain critical insights into both underlying neural representation and neural dynamics. First, visual motion perception relies on competing representations in opponent directions. Second, extended processing of a stimulus leads to changes in the brain, which we refer to as the adapted brain state. Where in the brain are circuits changing during adaptation? In other words, what is the site of these neural dynamics?
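The opponent-process account above can be illustrated with a toy sketch (our own illustration, not a model from the article): two direction-tuned channels whose gains fatigue in proportion to their drive will read out a neutral test stimulus as motion in the opposite direction.

```python
# Toy two-channel opponent model of the motion aftereffect.
# All names and dynamics here are illustrative assumptions.

def perceived_direction(drive_left, drive_right, gain_left, gain_right):
    """Readout: sign of the gain-weighted difference between channels."""
    signal = gain_right * drive_right - gain_left * drive_left
    if signal > 0:
        return "rightward"
    if signal < 0:
        return "leftward"
    return "stationary"

def adapt(gain, drive, rate=0.05, steps=100):
    """Use-dependent fatigue: each step pulls the gain toward (1 - drive)."""
    for _ in range(steps):
        gain += rate * ((1.0 - drive) - gain)
    return gain

gain_l, gain_r = 1.0, 1.0  # unadapted state
# Before adaptation, a static test (equal weak drive) reads out as stationary.
assert perceived_direction(0.1, 0.1, gain_l, gain_r) == "stationary"

# Adapt to rightward motion: only the rightward channel is driven hard.
gain_l = adapt(gain_l, drive=0.0)
gain_r = adapt(gain_r, drive=1.0)

# The identical static test is now read out as leftward: the aftereffect.
assert perceived_direction(0.1, 0.1, gain_l, gain_r) == "leftward"
```

The point of the sketch is that the illusion lives in the adapted gains, not in the test stimulus; any later readout that consults these channels inherits the bias.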

One intuitive answer is that visual motion aftereffects arise from local dynamics in visual cortex. Motion aftereffects also exist in the auditoryCitation4 and tactile domains,Citation5 suggesting that such neural dynamics are a general property of cortico-cortical or thalamo-cortical circuits.Citation6 However, we recently demonstrated that adaptation to tactile motion can lead to visual motion aftereffects, and vice versa.Citation1 Further, visual motion adaptation leads to auditory motion aftereffects.Citation7,Citation8 As such, these crossmodal aftereffects challenge the simple explanation that motion aftereffects arise from unisensory cortex alone. What properties of adaptation and representation are needed to explain how crossmodal aftereffects occur?

Adaptive Processing Hypothesis

Aftereffects reveal that extended processing of incoming sensory information changes the brain, and there are measurable consequences of this change in subsequent perception. What is the site of these neural dynamics? A naïve view is that dynamics are expressed only in a final integration stage, for example a single ‘higher’ order cortical area, where modulatory flexibility is inherent to its function. In the classic view, the explicit computational goal of this area is ‘bimodal’ integration. Crossmodal aftereffects would emerge, then, as the product of dynamics at this convergent center.

In contrast, we propose that any area or circuit that processes a stimulus is changed by that stimulus and that these dynamics are a functional property of areas throughout the system—the adaptive processing hypothesis. For example, motion-responsive neurons are found in many areas besides the “motion processing area” MT, including V1, V2, and V3, as well as in parietal areas. Thus, motion aftereffects likely originate not from adaptation in any one area or circuit but from many stages of processing, both in early sensory areas and in higher-level areas.Citation3

A corollary of the adaptive processing hypothesis is that at each level of processing, different aspects of the incoming stimulus are adapting, reflecting the underlying dimensions represented by those neural populations. For example, V1 responses reflect orientation, scale, and motion properties at a specific location, with receptive field sizes and tuning properties changing progressively through V2, V3, and MT. This view implies that different aftereffects might be observed across retinotopic locations based on the relative contribution of early and later areas in processing the subsequent test stimulus.

Thus, in adaptation paradigms, the subsequent test stimulus can be thought of as a probe of the adapted state. For example, following 10 sec of visual motion adaptation, presenting a static grating leads to retinotopic aftereffects of short duration with low illusory velocity. Following the same adaptation, presenting a dynamic grating instead leads to aftereffects in more spatial locations, with faster velocity and longer duration.Citation9 Importantly, the same adapted brain state can give rise to several different perceptual aftereffects. Similarly, following adaptation to a face, observers have stronger aftereffects when tested on upright vs. inverted faces, but also show aftereffects when tested with simple T-shaped stimuli.Citation10,Citation11 The critical insight is that the adapted brain state will have consequences for a subsequently presented test stimulus to the extent that the test stimulus depends on processing in those adapted areas. This framework helps explain the well-known fact that aftereffects depend on the relationship between the adapting and test stimuli.Citation3

In the case of crossmodal aftereffects, these paradigms simply use one modality to probe the adapted state induced by extended processing in another modality. For example, we recently demonstrated that visual and tactile motion adaptation lead to aftereffects in the other modality.Citation1 Based on the framework outlined above, processing tactile motion depends on circuits that were previously adapted by visual motion processing. Similarly, the processing of visual motion depends on circuits adapted by tactile motion. Crossmodal motion aftereffects reveal that visual and tactile motion perception rely on partially shared neural substrates.
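A minimal sketch of this shared-substrate account (hypothetical code, not the authors' model): if visual and tactile motion both drive, and both adapt, a single opponent circuit, then adapting through one modality biases the readout of a test stimulus delivered through the other.

```python
class SharedMotionStage:
    """One opponent motion circuit driven by more than one modality (toy model)."""

    def __init__(self):
        self.gain = {"left": 1.0, "right": 1.0}

    def respond(self, drive, modality):
        # The modality tag is deliberately ignored: in this sketch the stage
        # is process-selective, not modality-selective.
        out = {d: self.gain[d] * drive[d] for d in self.gain}
        for d in self.gain:                     # use-dependent fatigue
            self.gain[d] *= 1.0 - 0.3 * drive[d]
        return out

    def percept(self, drive, modality):
        out = self.respond(drive, modality)
        diff = out["right"] - out["left"]
        if abs(diff) < 1e-9:
            return "stationary"
        return "rightward" if diff > 0 else "leftward"


stage = SharedMotionStage()
for _ in range(10):  # adapt with VISUAL rightward motion
    stage.respond({"left": 0.0, "right": 1.0}, modality="vision")

# An ambiguous TACTILE test now reads out as leftward: crossmodal transfer.
print(stage.percept({"left": 0.2, "right": 0.2}, modality="touch"))
```

Because the adapted gains live in the shared stage rather than in either modality's input pathway, the transfer falls out automatically; nothing in the circuit needs to "know" which modality did the adapting.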

Process-Selective Cortical Circuits

One reason why there might be a site of shared processing between visual and tactile motion comes from an argument for efficient processing. If there is a neural circuit that is specialized to extract motion trajectories from spatio-temporal patterns of spiking input, it might be efficient to route information that requires that processing through that circuit. Indeed, visual motion and tactile motion appear to be processed in overlapping (or at least adjacent) areas.Citation12,Citation13

However, this logic does not extend to auditory motion, which does not activate area MTCitation14 (reviewed in ref. Citation15). One possible account for this discrepancy is that visual motion and tactile motion share a similar input pattern, in which a grid of sensors in the retina or skin receives spatial information over time. Interestingly, auditory motion information arrives not via a grid of spatial sensors but via interaural temporal differences, suggesting that these stimuli are routed to brain areas organized to perform a different computation more efficiently.

Several other examples of specialized processing circuits being used across modalities exist. For example, TMS studies have shown that fine spatial orientation judgments utilize V1 for visual as well as tactile stimuli.Citation16 Further, fMRI evidence has shown that fine-scale orientation judgments in the visual and tactile modalities both activate early visual cortex,Citation17 that visual and haptic shape activate lateral occipital areas,Citation18 and that even haptic exploration of faces is suggested to activate the face-selective visual area FFA.Citation19 If these areas fundamentally contribute to the perception of orientation, shape, and faces in both modalities, then we would predict that crossmodal aftereffects will be found. More generally, these data support process-selective, rather than stimulus-selective, cortical circuits. Indeed, emerging evidence that the neocortex is more multisensory than previously believedCitation20,Citation21 also suggests that defining areas by sensory modality might not accurately describe the underlying representation.

Conclusions

Aftereffects reveal that any incoming sensory information leads to changes in neural dynamics. Typically, when we sense the world, we sample the continuous stream of input with rapid exploratory patterns. The eyes saccade roughly three times per second, and maintaining steady fixation eventually causes the visual world to fade to a uniform gray. Similarly, the skin sweeps over surfaces, and without changing stimulation we cease to notice contact, e.g., with clothing. While active sensing rapidly samples different aspects of the physical world, adaptation paradigms force extended processing of a single aspect of the physical world (for adaptation at brief durations see ref. Citation22). In a sense, this extended processing during adaptation may accentuate the neural mechanisms and perceptual consequences that are continually operating at a more rapid timescale.

Crossmodal aftereffects provide several insights about these adaptive mechanisms. Specifically, we suggest that adaptation happens at all neural sites involved in processing the stimulus, e.g., by renormalizing competing representations to reflect the incoming sensory information. The test stimulus can be thought of as a probe of this adapted state – the extent of shared substrates in processing determines which aftereffect properties will be observed. Areas may be more process-dependent than stimulus-dependent, with crossmodal interactions following automatically in cases of shared processing demands.

References

  • Konkle T, Wang Q, Hayward V, Moore CI. Motion aftereffects transfer between touch and vision. Curr Biol 2009; 19:1 - 6; http://dx.doi.org/10.1016/j.cub.2009.03.035; PMID: 19135370
  • Wolgemuth A. On the aftereffect of seen movement. Br J Psychol 1911; 1:1 - 117
  • Mather G, Pavan A, Campana G, Casco C. The motion aftereffect reloaded. Trends Cogn Sci 2008; 12:481 - 7; http://dx.doi.org/10.1016/j.tics.2008.09.002; PMID: 18951829
  • Grantham DW, Wightman FL. Auditory motion aftereffects. Percept Psychophys 1979; 26:403 - 8; http://dx.doi.org/10.3758/BF03204166; PMID: 523284
  • Watanabe J, Hayashi S, Kajimoto H, Tachi S, Nishida S. Tactile motion aftereffects produced by appropriate presentation for mechanoreceptors. Exp Brain Res 2007; 180:577 - 82; http://dx.doi.org/10.1007/s00221-007-0979-z; PMID: 17549460
  • Moore CI. Frequency-dependent processing in the vibrissa sensory system. J Neurophysiol 2004; 91:2390 - 9; http://dx.doi.org/10.1152/jn.00925.2003; PMID: 15136599
  • Kitagawa N, Ichihara S. Hearing visual motion in depth. Nature 2002; 416:172 - 4; http://dx.doi.org/10.1038/416172a; PMID: 11894093
  • Jain A, Sally SL, Papathomas TV. Audiovisual short-term influences and aftereffects in motion: examination across three sets of directional pairings. J Vis 2008; 8(15):7, 1-13; http://dx.doi.org/10.1167/8.15.7; PMID: 19146291
  • Nishida S, Sato T. Motion aftereffect with flickering test patterns reveals higher stages of motion processing. Vision Res 1995; 35:477 - 90; http://dx.doi.org/10.1016/0042-6989(94)00144-B; PMID: 7900288
  • Fang F, Ijichi K, He S. Transfer of the face viewpoint aftereffect from adaptation to different and inverted faces. J Vis 2007; 7(13):6, 1-9; http://dx.doi.org/10.1167/7.13.6; PMID: 17997634
  • Susilo T, McKone E, Edwards M. Solving the upside-down puzzle: inverted face aftereffects derive from shape-generic rather than face-specific mechanisms. J Vis 2009; In press
  • Hagen MC, Franzén O, McGlone F, Essick G, Dancer C, Pardo JV. Tactile motion activates the human middle temporal/V5 (MT/V5) complex. Eur J Neurosci 2002; 16:957 - 64; http://dx.doi.org/10.1046/j.1460-9568.2002.02139.x; PMID: 12372032
  • Beauchamp MS, Yasar NE, Kishan N, Ro T. Human MST but not MT responds to tactile stimulation. J Neurosci 2007; 27:8261 - 7; http://dx.doi.org/10.1523/JNEUROSCI.0754-07.2007; PMID: 17670972
  • Lewis JW, Beauchamp MS, DeYoe EA. A comparison of visual and auditory motion processing in human cerebral cortex. Cereb Cortex 2000; 10:873 - 88; http://dx.doi.org/10.1093/cercor/10.9.873; PMID: 10982748
  • Poirier C, Collignon O, Devolder AG, Renier L, Vanlierde A, Tranduy D, et al. Specific activation of the V5 brain area by auditory motion processing: an fMRI study. Brain Res Cogn Brain Res 2005; 25:650 - 8; http://dx.doi.org/10.1016/j.cogbrainres.2005.08.015; PMID: 16298112
  • Zangaladze A, Epstein CM, Grafton ST, Sathian K. Involvement of visual cortex in tactile discrimination of orientation. Nature 1999; 401:587 - 90; http://dx.doi.org/10.1038/44139; PMID: 10524625
  • Sathian K, Zangaladze A, Hoffman JM, Grafton ST. Feeling with the mind’s eye. Neuroreport 1997; 8:3877 - 81; http://dx.doi.org/10.1097/00001756-199712220-00008; PMID: 9462459
  • Amedi A, Malach R, Hendler T, Peled S, Zohary E. Visuo-haptic object-related activation in the ventral visual pathway. Nat Neurosci 2001; 4:324 - 30; http://dx.doi.org/10.1038/85201; PMID: 11224551
  • Kilgour AR, Kitada R, Servos P, James TW, Lederman SJ. Haptic face identification activates ventral occipital and temporal areas: an fMRI study. Brain Cogn 2005; 59:246 - 57; http://dx.doi.org/10.1016/j.bandc.2005.07.004; PMID: 16157435
  • Ghazanfar AA, Schroeder CE. Is neocortex essentially multisensory?. Trends Cogn Sci 2006; 10:278 - 85; http://dx.doi.org/10.1016/j.tics.2006.04.008; PMID: 16713325
  • Driver J, Noesselt T. Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron 2008; 57:11 - 23; http://dx.doi.org/10.1016/j.neuron.2007.12.013; PMID: 18184561
  • Suzuki S, Cavanagh P. A shape-contrast effect for briefly presented stimuli. J Exp Psychol Hum Percept Perform 1998; 24:1315 - 41; http://dx.doi.org/10.1037/0096-1523.24.5.1315; PMID: 9778826