Original Article

Nonlinear Hebbian training of the perceptron

Pages 619-633 | Received 29 Jun 1995, Published online: 09 Jul 2009
Abstract

The effects of nonlinear modulation of the Hebbian learning rule on the performance of a perceptron are investigated. Both random classification and classification provided by a teacher perceptron are considered. Both the generalization ability and the learning rate are found to depend on the overlap between the teacher and the student and on the signal-to-noise ratio of the local field; moreover, they are independent of the specific teacher distribution when the ratio between the number of training examples and the perceptron size is small. An analytic expression is obtained for the optimal modulation function under different classification schemes. For random and Gaussian teacher classifications the best choice of modulation appears to be linear; for binary teachers it is shown to be the hyperbolic tangent. The modifications to the latter that arise from diluting the binary teacher are also obtained in analytic form.
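To make the setup concrete, the following is a minimal sketch of one plausible reading of the scheme the abstract describes: the modulation function is applied component-wise to the raw Hebbian field accumulated from the labeled examples, and a binary teacher (weight components in {-1, +1}) is compared under linear versus hyperbolic-tangent modulation. The sizes, the scaling of the field, and the signal-to-noise expression a = sqrt(2*alpha/pi) are assumptions drawn from the standard analysis of Hebbian learning, not values taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration (not taken from the paper).
N = 1000              # perceptron size
alpha = 4.0           # ratio of training examples to perceptron size
P = int(alpha * N)

# Binary teacher: weight components drawn from {-1, +1}.
B = rng.choice([-1.0, 1.0], size=N)

# Training set: Gaussian inputs, labels given by the teacher perceptron.
xi = rng.standard_normal((P, N))
labels = np.sign(xi @ B / np.sqrt(N))

# Raw Hebbian field per weight, scaled so its noise part has
# approximately unit variance.
H = labels @ xi / np.sqrt(P)

# Assumed signal-to-noise ratio of the field for this setup,
# from the standard Hebbian calculation: a = sqrt(2*alpha/pi).
a = np.sqrt(2.0 / np.pi * alpha)

def overlap(J):
    """Teacher-student overlap R = J.B / (|J| |B|)."""
    return J @ B / (np.linalg.norm(J) * np.linalg.norm(B))

# Linear modulation (plain Hebb) vs. hyperbolic-tangent modulation.
J_linear = H
J_tanh = np.tanh(a * H)

print(f"overlap, linear modulation: {overlap(J_linear):.4f}")
print(f"overlap, tanh modulation:   {overlap(J_tanh):.4f}")
```

Under this reading, the tanh modulation yields a slightly larger teacher-student overlap than plain Hebb for a binary teacher, consistent with the abstract's claim that the hyperbolic tangent is the optimal modulation in that case; for a Gaussian teacher the linear choice would be expected to win instead.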
