References
- Ash T. Dynamic Node Creation in Backpropagation Networks. Institute for Cognitive Science, University of California, San Diego, La Jolla, CA 1989, Tech. Report ICS 8901
- Baffes P T, Zelle J M. Growing layers of perceptrons: introducing the Extentron algorithm. Proc. 1992 Int. Joint Conf. on Neural Networks, Baltimore, MD. IEEE, Piscataway, NJ 1992; II-392-7
- Beale R, Jackson T. Neural Computing: An Introduction. IOP Publishing, Bristol 1991
- Chandrasekaran B, Goel A, Allemang D. Connectionism and information-processing abstractions. AI Magazine 1988; Winter
- Fahlman S E. Faster-learning variations on back-propagation: an empirical study. Proc. 1988 Connectionist Models Summer School, D S Touretzky, G Hinton, T Sejnowski. Morgan Kaufmann, San Mateo, CA 1988
- Fahlman S E, Lebiere C. The cascade-correlation learning architecture. Advances in Neural Information Processing Systems, D S Touretzky. Morgan Kaufmann, San Mateo, CA 1990; 2: 524–32
- Feldman J A. Dynamic connections in neural networks. Biol. Cybern. 1982; 46
- Frean M. The Upstart algorithm: a method for constructing and training feedforward neural networks. Neural Comput. 1990; 2: 198–209
- Freeman J A, Skapura D M. Neural Networks, Algorithms, Applications and Programming Techniques. Addison-Wesley, Reading, MA 1991
- Hall L O, Romaniuk S G. A hybrid connectionist, symbolic learning system. AAAI-90. Boston, MA 1990
- Hanson S J, Pratt L Y. Some comparisons of constraints for minimal network construction with back-propagation. Advances in Neural Information Processing Systems, D S Touretzky. Morgan Kaufmann, San Mateo, CA 1989; 1
- Hassibi B, Stork D. Second order derivatives for network pruning: optimal brain surgeon. Advances in Neural Information Processing Systems. Morgan Kaufmann, San Mateo, CA 1992; 4
- Hirose Y, Yamashita K, Hijiya S. Back-propagation algorithm which varies the number of hidden units. Neural Networks 1991; 4: 61–6
- Honavar V, Uhr L. A network of neuron-like units that learns to perceive by generation as well as reweighting of its links. Proc. 1988 Connectionist Models Summer School, D S Touretzky, G Hinton, T Sejnowski. Morgan Kaufmann, San Mateo, CA 1988
- Le Cun Y, Denker J S, Solla S A. Optimal brain damage. Advances in Neural Information Processing Systems, D S Touretzky. Morgan Kaufmann, San Mateo, CA 1990; 2: 598–605
- Mozer M C, Smolensky P. Skeletonization: a technique for trimming the fat from a network via relevance assessment. Advances in Neural Information Processing Systems, D S Touretzky. Morgan Kaufmann, San Mateo, CA 1989; 1: 107–15
- Rissanen J. Stochastic Complexity in Statistical Inquiry. World Scientific, Singapore 1989
- Romaniuk S G, Hall L O. Divide and Conquer networks. Neural Networks 1992, submitted
- Sanger T D. A tree-structured adaptive network for function approximation in high dimensional spaces. IEEE Trans. Neural Networks 1991; 2: 285–93
- Shavlik J W, Dietterich T G. Readings in Machine Learning. Morgan Kaufmann, San Mateo, CA 1990
- Smotroff I G, Friedman D H, Connolly D. Self organizing modular neural networks. Proc. Int. Joint Conf. on Neural Networks, Seattle, WA. IEEE, Piscataway, NJ 1991; II-187-92
- Thrun S B, et al. The MONK's Problems: a performance comparison of different learning algorithms. Department of Computer Science, Carnegie Mellon University. 1991, Tech. Report CMU-CS-91-197
- Vapnik V N, Chervonenkis A Y. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and its Applications 1971; 16: 264–80
- Vapnik V N. Inductive principles of the search for empirical dependences. Proc. 2nd Annual Workshop on Computational Learning Theory. Morgan Kaufmann, San Mateo, CA 1989; 3–21