Original Article

Learning internal representations in an attractor neural network with analogue neurons

Pages 359-388 | Received 07 Nov 1994, Published online: 09 Jul 2009
 

Abstract

A learning attractor neural network (LANN) with a double dynamics of neural activities and synaptic efficacies, operating on two different timescales, is studied by simulations in preparation for an electronic implementation. The present network includes several quasi-realistic features: neurons are represented by their afferent currents and output spike rates; excitatory and inhibitory neurons are separated; attractor spike rates as well as coding levels in arriving stimuli are low; learning takes place only between excitatory units. Synaptic dynamics is an unsupervised, analogue Hebbian process, but long-term memory in the absence of neural activity is maintained by a refresh mechanism which, on long timescales, discretizes the synaptic values, converting learning into an asynchronous stochastic process induced by the stimuli on the synaptic efficacies.
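As a rough illustration of the two-timescale synaptic dynamics described above, the following Python sketch combines a fast analogue Hebbian update with a slow refresh that drifts each efficacy toward its nearest discrete level. All names and values (the learning rate, refresh rate, the two stable levels, the network size) are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of a two-timescale synaptic dynamics: fast analogue Hebbian
# learning plus a slow refresh toward discrete levels. All parameters are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 100                        # excitatory neurons (assumed size)
eta = 0.05                     # fast analogue Hebbian learning rate (assumed)
refresh_rate = 0.01            # slow drift toward discrete levels (assumed)
levels = np.array([0.0, 1.0])  # two stable efficacy values (bistable synapse)

J = rng.uniform(0.0, 1.0, size=(N, N))   # analogue synaptic efficacies
np.fill_diagonal(J, 0.0)

def hebbian_step(J, rates):
    """Fast analogue Hebbian update driven by pre/post activity."""
    J = J + eta * np.outer(rates, rates)
    np.fill_diagonal(J, 0.0)
    return np.clip(J, levels.min(), levels.max())

def refresh_step(J):
    """Slow refresh: each efficacy drifts toward its nearest discrete level,
    so memory persists in the absence of neural activity."""
    nearest = levels[np.argmin(np.abs(J[..., None] - levels), axis=-1)]
    return J + refresh_rate * (nearest - J)

# One presentation of a low-coding-level stimulus, then a quiet interval
# during which only the refresh acts.
stimulus = (rng.random(N) < 0.1).astype(float)   # ~10% of neurons active
J = hebbian_step(J, stimulus)
for _ in range(100):
    J = refresh_step(J)
```

In this sketch the refresh is what makes learning effectively stochastic on long timescales: only stimuli that push an efficacy past the midpoint between levels produce a lasting transition.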

This network is intended to learn a set of attractors from the statistics of freely arriving stimuli, which are represented by external synaptic inputs injected into the excitatory neurons. In the simulations, different types of sequences of many thousands of stimuli are presented to the network, without distinguishing in the dynamics a learning phase from retrieval. Stimulus sequences differ in their pre-assigned global statistics (including time-dependent statistics); in the orders of presentation of individual stimuli within a given statistics; in the lengths of the time intervals for each presentation; and in the intervals separating one stimulus from another.
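The stimulus protocol described above could be generated along the following lines: stimulus classes are drawn from pre-assigned probabilities, each pattern has a low coding level, and presentation and inter-stimulus intervals are drawn at random. The class probabilities, coding level, and interval ranges below are hypothetical placeholders, not the values used in the paper.

```python
# Sketch of a stimulus sequence with pre-assigned global statistics.
# Class probabilities, coding level f, and interval ranges are assumptions.
import numpy as np

rng = np.random.default_rng(1)

N = 100                 # excitatory neurons receiving external input (assumed)
f = 0.1                 # coding level: fraction of neurons active per stimulus
n_classes = 5
class_probs = np.full(n_classes, 1.0 / n_classes)   # global statistics

# One fixed prototype pattern per stimulus class.
prototypes = (rng.random((n_classes, N)) < f).astype(float)

def next_presentation():
    """Draw a class by the current global statistics, with random
    presentation and inter-stimulus interval lengths (in time steps)."""
    c = rng.choice(n_classes, p=class_probs)
    t_on = rng.integers(50, 200)     # stimulus duration
    t_off = rng.integers(20, 100)    # interval before the next stimulus
    return prototypes[c], t_on, t_off

# Example: draw a few presentations from the sequence.
for _ in range(3):
    pattern, t_on, t_off = next_presentation()
    print(int(pattern.sum()), t_on, t_off)
```

Time-dependent statistics, as mentioned in the abstract, would simply correspond to changing `class_probs` during the run.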

We find that the network effectively learns a set of attractors representing the statistics of the stimuli, and is able to modify its attractors when the input statistics change. Moreover, as the global input statistics changes, the network can also forget attractors related to stimulus classes no longer presented. Forgetting takes place only due to the arrival of new stimuli. The performance of the network and the statistics of the attractors are studied as a function of the input statistics. Most of the large-scale characteristics of the learning dynamics can be captured theoretically.

This model modifies a previous implementation of a LANN composed of discrete neurons into a network of more realistic neurons. The different elements have been designed to facilitate their implementation in silicon.

†On leave of absence from Racah Institute of Physics, Jerusalem, Israel.
