Research Article

Global optimization with a new method based on the gradient direction vector for unconstrained differentiable objective functions

Pages 1783-1797 | Received 01 Dec 2020, Published online: 17 Dec 2021
 

Abstract

Gradient descent is a popular optimization method. It applies to differentiable, unconstrained objective functions, but it faces several obstacles: the choice of the initial point, sensitivity to the type of objective function, and the risk of converging to a local rather than a global solution. In this work, we propose an optimization method for computing the global optimum of unconstrained, differentiable, single-objective functions. The method is inspired by gradient descent; its principle is to use a new quantity based on the gradient direction vector instead of the gradient itself. Unlike gradient descent and other gradient-based methods (the conjugate gradient method and Newton's method), where the initial point is chosen at random, our method fixes the initial point at the outset, and the shape of the objective-function curve poses no difficulty. Finally, we apply the method to several examples: a nonlinear function and quadratic functions, both non-convex and convex.
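The authors' exact update rule is not reproduced on this page, but the core idea the abstract describes, stepping along the unit gradient direction rather than the raw gradient so that the step length does not depend on the gradient's magnitude, can be sketched as follows. This is a minimal, hypothetical illustration: the function name direction_descent, the diminishing step schedule step0 / (k + 1), and the toy objective are assumptions for the sketch, not the paper's method.

```python
import numpy as np

def direction_descent(grad, x0, step0=0.5, tol=1e-8, max_iter=10_000):
    """Descend along the negative *unit* gradient direction with a
    diminishing step size step0 / (k + 1).

    Hypothetical sketch of the idea in the abstract (using the gradient
    direction vector instead of the gradient); the authors' actual update
    rule and initial-point scheme are not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = np.asarray(grad(x), dtype=float)
        norm = np.linalg.norm(g)
        if norm < tol:                        # (near-)stationary point reached
            break
        x = x - (step0 / (k + 1)) * g / norm  # unit direction, shrinking step
    return x

# Toy non-convex example: f(x) = x**4 - 3*x**2 + x, so f'(x) = 4x**3 - 6x + 1.
x_min = direction_descent(lambda x: 4 * x**3 - 6 * x + 1, x0=[2.0])
print(x_min)  # approaches a stationary point of f near x = 1.13
```

Because the step depends only on the gradient's direction, the iterate moves the same distance whether the gradient is large or tiny; the shrinking schedule then lets it settle near a stationary point.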

