References
- Judd S. Learning in networks is hard. IEEE First Int. Conf. on Neural Networks, San Diego, California, June 21–24, 1987; Vol II: 685–92, IEEE catalogue number 87TH0191-7
- Rumelhart D E, McClelland J L. Parallel Distributed Processing. MIT Press, Cambridge, MA 1986; vols 1-2
- Kolen J F, Pollack J B. Backpropagation is sensitive to initial conditions. Complex Systems 1990; 4: 269–80
- Fahlman S E, Lebiere C. The cascade-correlation learning architecture. Advances in Neural Information Processing Systems 2. Morgan Kaufmann, San Mateo, CA 1990; 524–32
- Frean M. The upstart algorithm: a method for constructing and training feedforward neural networks. Neural Computation 1990; 2: 198–209
- Gallant S I. Perceptron-based learning algorithms. IEEE Trans. Neural Networks 1990; 1: 179–91
- Golea M, Marchand M. A growth algorithm for neural network decision trees. Europhys. Lett. 1990; 12: 205–10
- Keibek S A J, Barkema G T, Andree H M A, Savenije M H F, Taal A. A fast partitioning algorithm and a comparison of binary feedforward neural networks. Europhys. Lett. 1992; 18: 555–9
- Marchand M, Golea M, Ruján P. A convergence theorem for sequential learning in two-layer perceptrons. Europhys. Lett. 1990; 11: 487–92
- Marchand M, Golea M. On learning simple neural concepts: from halfspace intersections to neural decision lists. Network 1992; 4: 67–85
- Martinez D, Estève D. The offset algorithm: building and learning method for multilayer neural networks. Europhys. Lett. 1992; 18: 95–100
- Mézard M, Nadal J-P. Learning in feedforward layered networks: the tiling algorithm. J. Phys. A: Math. Gen. 1989; 22: 2191–203
- Ruján P, Marchand M. Learning by minimizing resources in neural networks. Complex Systems 1989; 3: 229–41
- Sirat J A, Nadal J-P. Neural trees: a new tool for classification. Network 1990; 1: 423–38
- Minsky M L, Papert S A. Perceptrons. MIT Press, Cambridge, MA 1969
- Duda R O, Hart P E. Pattern Classification and Scene Analysis. Wiley, New York 1973
- Gallant S I. Optimal linear discriminants. IEEE Proc. VIII Int. Conf. on Pattern Recognition, Paris, France, 1986; 849–52
- Bottou L, Vapnik V. Local learning algorithms. Neural Computation 1992; 4: 888–900
- Andree H M A, Barkema G T, Lourens W, Taal A, Vermeulen J C. A comparative study of binary feedforward neural networks and digital circuits. Neural Networks 1993, in press
- Booth T L. Digital Networks and Computer Systems. Wiley, New York 1971
- Parzen E. On estimation of a probability density function and mode. Annals of Mathematical Statistics 1962; 33: 1065–76
- Schiøler H, Hartmann U. Mapping neural networks derived from the Parzen window estimator. Neural Networks 1992; 5: 903–9
- Specht D F. Probabilistic neural networks. Neural Networks 1990; 3: 109–18
- Koonin S E. Computational Physics. Benjamin Cummings, New York 1986
- Metropolis N, Rosenbluth A W, Rosenbluth M N, Teller A H, Teller E. Equation of state calculations by fast computing machines. J. Chem. Phys. 1953; 21: 1087–92
- Barkema G, de Boer J. Numerical study of phase transitions in Potts models. Phys. Rev. A. 1991; 44: 8000–5
- Vapnik V N, Chervonenkis A Ya. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and its Applications 1971; 16: 264–80
- Vapnik V. Estimation of Dependences Based on Empirical Data. Springer, Berlin 1982
- Baum E B, Haussler D. What size net gives valid generalisation? Neural Computation 1989; 1: 151–60