Research Article

Improving learning in neural networks through weight initializations

Pages 951-971 | Received 01 Mar 2020, Published online: 13 Jun 2021
Abstract

Feedforward neural networks are known to exhibit the universal approximation property: they can approximate any function arbitrarily well, provided there is one hidden layer with a sufficient number of hidden nodes and some non-linearity at the hidden-layer nodes. This universal approximation property has contributed to the wide range of applications of these networks, most notably in predictive modelling. However, these networks have a major drawback: slow convergence. In this work, we propose a new weight initialization algorithm to improve convergence speed in feedforward networks, using a statistical analysis of the system based on a set of predefined assumptions. The proposed algorithm is compared with other established weight initialization algorithms for predictive models and is expected to perform better than those algorithms.
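The abstract does not reproduce the proposed algorithm itself. For context, a widely used baseline against which such schemes are typically compared is Xavier/Glorot initialization, sketched below in NumPy (illustrative only; this is not the paper's proposed method, and the layer sizes are arbitrary examples):

```python
import numpy as np

def xavier_init(n_in, n_out, rng=None):
    """Xavier/Glorot uniform initialization.

    Draws weights from U(-limit, limit) with
    limit = sqrt(6 / (n_in + n_out)), which keeps the variance of
    activations roughly constant from layer to layer and tends to
    speed up convergence relative to naive random initialization.
    """
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

# Example: a one-hidden-layer feedforward network with 4 inputs,
# 8 hidden nodes, and 1 output (sizes chosen for illustration).
W1 = xavier_init(4, 8)   # input -> hidden weights
W2 = xavier_init(8, 1)   # hidden -> output weights
```

Initialization schemes of this kind serve as natural comparison points for any new weight initialization algorithm aimed at faster convergence.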
