Abstract
We obtain a sharp lower bound for the error of approximation of a continuous function by single hidden layer neural networks with a continuous activation function and weights varying along two fixed directions. We show that for a certain class of activation functions this lower bound is attained with equality. The result yields a method for the direct computation of the approximation error. As an application, we give a formula that can be used to compute instantly the approximation error for a class of functions possessing second-order partial derivatives.
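To make the setting concrete, the following sketch builds a single hidden layer network whose weights lie along two fixed directions and measures its empirical sup-norm approximation error on a grid. The ridge-sum form, the choice of directions, activation, thresholds, and target function are all illustrative assumptions, not the paper's construction or formula.

```python
import numpy as np

# Assumption: the network has the ridge-sum form
#   N(x) = sum_i c_i * sigma(a.x - t_i) + sum_j d_j * sigma(b.x - s_j),
# where a and b are the two fixed directions and only the coefficients
# and thresholds vary. Everything below is an illustrative sketch.

a = np.array([1.0, 0.0])  # first fixed direction (hypothetical choice)
b = np.array([0.0, 1.0])  # second fixed direction (hypothetical choice)

def features(x, t, s, sigma=np.tanh):
    """Ridge features sigma(a.x - t_i) and sigma(b.x - s_j) at points x."""
    pa = x @ a  # projections of the sample points onto direction a
    pb = x @ b  # projections of the sample points onto direction b
    return np.hstack([sigma(pa[:, None] - t[None, :]),
                      sigma(pb[:, None] - s[None, :])])

# A continuous target on [0,1]^2 (chosen so it is NOT a sum of ridge
# functions along a and b, hence the error stays bounded away from zero).
f = lambda x: np.sin(np.pi * x[:, 0]) * np.cos(np.pi * x[:, 1])

g = np.linspace(0.0, 1.0, 50)
X = np.array([[u, v] for u in g for v in g])  # uniform grid on the square

t = np.linspace(-1.0, 2.0, 10)  # thresholds for direction a
s = np.linspace(-1.0, 2.0, 10)  # thresholds for direction b
Phi = features(X, t, s)

# Fit coefficients by least squares and report the empirical sup-norm error.
coef, *_ = np.linalg.lstsq(Phi, f(X), rcond=None)
err = np.max(np.abs(Phi @ coef - f(X)))
print(f"empirical sup-norm error on the grid: {err:.4f}")
```

A computed lower bound of the kind obtained in the paper would certify that no choice of coefficients and thresholds can drive this error below a fixed positive quantity, whereas the fit above only exhibits one particular network.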
Acknowledgments
Part of this work was done during the author’s visit to Gebze Technical University. The author would like to thank Professor Mansur Isgenderoglu for his invitation, warm hospitality, and interesting discussions.