Original Article

Internal representation with minimum entropy in recurrent neural networks: minimizing entropy through inhibitory connections

Pages 423-440 | Received 12 Dec 1992, Published online: 09 Jul 2009

