Abstract
This paper examines the chaotic behavior of Back Propagation neural networks during the training phase. The networks are trained using ordinary parameter values, and two different cases are considered. In the first case, the network does not reach the desired convergence within a pre-specified number of epochs. The chaotic behavior of this network is demonstrated by examining the dominant Lyapunov exponents of the weight data series produced by additional training. For each training epoch, the data series representing the input patterns that produce the minimum absolute output error during additional training is also subjected to Lyapunov exponent analysis. The aim of this analysis is to determine whether the network exhibits chaotic pattern competition among the best-learned inputs. In the second case, the network is improved and the desired convergence is achieved. Again, the investigation focuses on the series of values representing input patterns that produce outputs with minimum absolute error. The dominant Lyapunov exponent estimates show that chaotic pattern competition is still present, despite the fact that the network practically satisfies stability demands within predetermined accuracy limits. The best estimation series consist of the output values corresponding to the best-learned input patterns. These series are examined using the theoretical tool of topological conjugacy, in addition to numerical verification of the results.
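As a minimal illustration of the kind of analysis described above, the following Python sketch estimates a dominant Lyapunov exponent from a scalar data series using a Rosenstein-style nearest-neighbor divergence method. This is an assumed, generic estimator, not the paper's actual procedure; the function name `dominant_lyapunov` and all parameter choices are hypothetical, and a chaotic logistic map stands in for the weight series.

```python
import numpy as np

def dominant_lyapunov(series, emb_dim=2, delay=1, min_sep=20, n_steps=8):
    """Estimate the dominant Lyapunov exponent of a scalar series via the
    average log-divergence of nearest neighbors (Rosenstein-style sketch)."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (emb_dim - 1) * delay           # number of delay-embedded points
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(emb_dim)])
    usable = n - n_steps                         # leave room to follow neighbors forward
    # Pairwise distances between embedded points.
    dists = np.linalg.norm(emb[:usable, None, :] - emb[None, :usable, :], axis=2)
    idx = np.arange(usable)
    # Exclude temporally close points so "neighbors" are true recurrences.
    dists[np.abs(idx[:, None] - idx[None, :]) < min_sep] = np.inf
    nn = np.argmin(dists, axis=1)                # nearest neighbor of each point
    mean_log_div = []
    for k in range(1, n_steps + 1):
        sep = np.linalg.norm(emb[idx + k] - emb[nn + k], axis=1)
        mean_log_div.append(np.mean(np.log(sep[sep > 0])))
    # The slope of mean log-divergence vs. step count approximates the exponent.
    slope, _ = np.polyfit(np.arange(1, n_steps + 1), mean_log_div, 1)
    return slope

# Chaotic logistic map (r = 4) as a stand-in for a weight data series; its
# true dominant Lyapunov exponent is ln 2 ≈ 0.693, so the estimate should
# come out positive, signaling chaos.
x = np.empty(1500)
x[0] = 0.3
for i in range(1499):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
lam = dominant_lyapunov(x)
```

A positive estimate is the signature of chaotic behavior; applied to the weight series or the best-learned-pattern series, the same test distinguishes chaotic pattern competition from stable convergence.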
C.R. Categories: