Abstract
Most supervised neural networks are trained by minimizing the mean squared error over the training set. However, the mean squared error has drawbacks: whenever the target output equals the actual output, the error signal tends to zero, which can also destabilize the internal structure of the network. In this paper, we discuss improved convergence rates for the standard backpropagation model obtained through modifications to its learning strategies. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.
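As an illustration of the vanishing error signal mentioned above, a minimal sketch (not the authors' implementation) of the standard output-layer delta for mean squared error with a sigmoid unit, where the function name and values are hypothetical:

```python
def mse_delta(target, output):
    """Output-layer error signal for MSE with a sigmoid unit:
    delta = (target - output) * f'(net), with f'(net) = output * (1 - output)."""
    return (target - output) * output * (1.0 - output)

# When the actual output matches the target, the error signal is exactly zero,
# so the weight update (learning_rate * delta * input) stalls.
print(mse_delta(1.0, 1.0))  # 0.0
print(mse_delta(1.0, 0.5))  # 0.125
```

Note that the sigmoid derivative factor also shrinks near saturated outputs, which is one reason modified learning strategies can improve convergence.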
∗Corresponding author