
Computing the Approximation Error for Neural Networks with Weights Varying on Fixed Directions

Pages 1395-1409 | Received 27 Oct 2018, Accepted 06 Apr 2019, Published online: 30 Apr 2019
 

Abstract

We obtain a sharp lower bound estimate for the error of approximation of a continuous function by single hidden layer neural networks with a continuous activation function and weights varying on two fixed directions. We show that for a certain class of activation functions this lower bound estimate turns into equality. The obtained result provides a method for direct computation of the approximation error. As an application, we give a formula that can be used to compute the approximation error instantly for a class of functions having second-order partial derivatives.
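The networks considered in the abstract have hidden-layer weights restricted to scalar multiples of two fixed directions, so the network computes a sum of ridge functions along those directions. The following is a minimal numerical sketch of this setting, not the paper's method: it fixes the two coordinate directions, picks an illustrative target function and a tanh activation, fits only the outer coefficients by least squares, and reports the empirical sup-norm approximation error on a grid. The target function, directions, unit count, and sampling are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: a smooth function on [0,1]^2 (not from the paper).
def f(x, y):
    return np.sin(np.pi * x) * np.cos(np.pi * y)

# Two fixed directions (an assumed example; the paper treats general pairs).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

sigma = np.tanh  # a continuous activation function

# Sample points on a uniform grid.
xs = np.linspace(0.0, 1.0, 40)
X, Y = np.meshgrid(xs, xs)
P = np.stack([X.ravel(), Y.ravel()], axis=1)
target = f(P[:, 0], P[:, 1])

# Hidden units with weights t*a or s*b (scalar multiples of the fixed
# directions), random inner scales and thresholds; the outer coefficients
# are then fit by linear least squares.
n_units_per_direction = 20
features = []
for d in (a, b):
    proj = P @ d
    for _ in range(n_units_per_direction):
        t = rng.uniform(-4.0, 4.0)
        theta = rng.uniform(-2.0, 2.0)
        features.append(sigma(t * proj + theta))
A = np.column_stack(features)
c, *_ = np.linalg.lstsq(A, target, rcond=None)

# Empirical sup-norm approximation error over the grid.
err = np.max(np.abs(A @ c - target))
print(f"max |f - network| on grid: {err:.4f}")
```

For this target the error cannot be driven to zero with these two directions alone, since sin(πx)cos(πy) = ½ sin(π(x+y)) + ½ sin(π(x−y)) is a sum of ridge functions along (1,1) and (1,−1) rather than along the coordinate axes; the paper's lower bound quantifies exactly this kind of obstruction.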

2000 Mathematics Subject Classification:

Acknowledgments

A part of this work was done during the author’s visit to Gebze Technical University. The author would like to thank Professor Mansur Isgenderoglu for his invitation, warm hospitality and interesting discussions.

Additional information

Funding

This research was supported by the Science Development Foundation under the President of the Republic of Azerbaijan (grant no. EIF/MQM/Elm-Tehsil-1-2016-1(26)-71/08/01).

