References
- Hornik K, Stinchcombe M, White H. Multilayer feedforward networks are universal approximators. Neural Networks 1989; 2: 359–366
- Weigend A, Rumelhart D, Huberman B. Back-propagation, weight-elimination, and time series prediction. Proc. 1990 Connectionist Models Summer School (University of California, San Diego). Morgan Kaufmann, San Mateo, CA, 1990; 105–116
- Moody J, Darken C. Learning with localised receptive fields. Proc. 1988 Connectionist Models Summer School (Carnegie Mellon University, Pittsburgh, PA). Morgan Kaufmann, San Mateo, CA, 1988; 38–51
- Hanson S, Pratt L. Comparing biases for minimal network construction with back propagation. Advances in Neural Information Processing Systems. Morgan Kaufmann, San Mateo, CA, 1989; 1: 177–185
- Abe S, Kayama M, Takenaga H. How neural networks for pattern recognition can be synthesised. J. Inf. Process. 1991; 14(3), in press
- Woodland P. Weight limiting, weight quantisation, and generalisation in multilayer perceptrons. Proc. 1989 IEEE Conf. on Artificial Neural Networks. IEEE, Piscataway, NJ, 1989; 297–300
- Sietsma J, Dow R. Neural net pruning – why and how. Proc. IEEE Int. Conf. on Neural Networks (San Diego, CA, 1988). IEEE, Piscataway, NJ, 1988; 325–333
- Mozer M, Smolensky P. Skeletonisation: a technique for trimming the fat from a network via relevance assessment. Advances in Neural Information Processing Systems. Morgan Kaufmann, San Mateo, CA, 1989; 1: 107–115
- Kayama M, Abe S, Takenaga H, Morooka Y. Constructing optimal neural networks by linear regression analysis. Proc. Neuro-Nimes '90. 1990; 363–378
- Rumelhart D, McClelland J. Parallel Distributed Processing: Explorations in the Microstructure of Cognition. MIT Press, Cambridge, MA, 1986; 1: 322–328