Abstract
Learning in humans and animals is accompanied by a penumbra: learning one task benefits from learning an unrelated task shortly before or after. At the cellular level, the penumbra of learning appears when weak potentiation of one synapse is amplified by strong potentiation of another synapse on the same neuron during a critical time window. Weak potentiation sets a molecular tag that enables the synapse to capture plasticity-related proteins synthesized in response to strong potentiation at another synapse. This paper describes a computational model that formalizes synaptic tagging and capture in terms of statistical learning mechanisms. According to this model, synaptic strength encodes a probabilistic inference about the dynamically changing association between pre- and postsynaptic firing rates. The rate of change is itself inferred, coupling together different synapses on the same neuron. When the inputs to one synapse change rapidly, the inferred rate of change increases, amplifying learning at other synapses.
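The coupling described in the abstract can be illustrated with a minimal sketch, not the paper's exact model: each synapse tracks its weight with a scalar Kalman filter, and a shared volatility (process-noise) estimate, updated heuristically from the standardized prediction errors of all synapses, plays the role of the inferred rate of change. The noise variances, the jump scenario, and the multiplicative volatility-update rule below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_steps = 200
r = 0.25      # observation-noise variance (assumed)
q = 0.01      # shared volatility (process-noise) estimate, inferred online
lr_q = 0.2    # learning rate for the volatility update (assumed)

# Ground truth: synapse 0 jumps abruptly at t = 100; synapse 1 stays fixed.
w_true = np.zeros((n_steps, 2))
w_true[100:, 0] = 2.0
w_true[:, 1] = 0.3

w_hat = np.zeros(2)   # posterior mean of each synaptic weight
p = np.ones(2)        # posterior variance of each synaptic weight
q_hist = np.zeros(n_steps)
gains = np.zeros((n_steps, 2))

for t in range(n_steps):
    x = rng.normal(1.0, 0.1, size=2)                         # presynaptic rates
    y = w_true[t] * x + rng.normal(0.0, np.sqrt(r), size=2)  # postsynaptic drive

    p = p + q                 # predict: weights diffuse with shared variance q
    s = p * x**2 + r          # innovation variance
    k = p * x / s             # Kalman gain (effective learning rate)
    err = y - w_hat * x       # prediction error
    w_hat = w_hat + k * err   # update posterior mean
    p = (1.0 - k * x) * p     # update posterior variance
    gains[t] = k

    # Infer the shared rate of change from ALL synapses' standardized errors:
    # a surprise at one synapse raises q, amplifying learning at the others.
    surprise = np.mean(err**2 / s) - 1.0
    q = max(1e-4, q * (1.0 + lr_q * surprise))
    q_hist[t] = q
```

After the abrupt change at synapse 0, the inferred volatility `q` spikes, so the Kalman gain (learning rate) at synapse 1 rises as well, even though its own input never changed.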
Notes
Notice of correction:
Redundant text has been removed from the opening paragraph of this article.
[1] The weight dynamics in their current form violate Dale's Law (Eccles, 1964), which in its abstract form states that a synapse is either excitatory or inhibitory. To make the dynamics more biologically plausible, we could introduce a rectification that prevents the weight from changing sign. This modification is not explored here.
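The rectification mentioned in note [1] could be sketched as follows; the helper name and the sign convention are illustrative assumptions, not part of the original model.

```python
def rectified_update(w, dw, sign):
    """Apply a weight update, clipped at zero so the synapse keeps its sign.

    sign = +1 for an excitatory synapse, -1 for an inhibitory one
    (an assumed convention, not from the original model).
    """
    w_new = w + dw
    # Rectify at zero: an excitatory weight may not become negative,
    # and an inhibitory weight may not become positive.
    return sign * max(sign * w_new, 0.0)
```

For example, `rectified_update(0.2, -0.5, +1)` returns `0.0` rather than letting an excitatory weight flip to `-0.3`.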
[2] We thank an anonymous reviewer for pointing this out.