Abstract
Direct Search algorithms are classical derivative-free methods for optimization. Though endowed with solid theoretical properties, they are not well suited to large-scale problems, owing to slow convergence and poor scaling. In this paper, we discuss how, on problems for which a hierarchy of objective functions is available, these limitations can be circumvented by multilevel schemes that accelerate the computation of a solution on the finest level. Starting from a previously introduced derivative-free multilevel method, based on Coordinate Search optimization with a sampling strategy of Gauss–Seidel type, we also consider sampling strategies of Jacobi type and present several algorithmic variations. We justify our choices by performing experiments on two model problems, showing that performance close to multigrid optimality can be observed in practice.
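To make the distinction between the two sampling strategies concrete, the following is a minimal, hypothetical sketch of a basic Coordinate Search method; it is not the paper's algorithm (in particular, the multilevel hierarchy and the convergence safeguards are omitted). The Gauss–Seidel-type variant accepts the first improving coordinate direction and updates the iterate immediately during the sweep, while the Jacobi-type variant evaluates all trial points from the same center before moving. All function and parameter names here are illustrative.

```python
import numpy as np

def coordinate_search(f, x0, step=1.0, tol=1e-8, max_iter=1000, jacobi=False):
    """Basic Coordinate Search sketch: poll the 2n coordinate directions
    +/- e_i, halving the step size when no direction improves f."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        if step < tol:
            break
        if jacobi:
            # Jacobi-type sampling: all 2n trial points share the same
            # center; move to the best one only if it improves f.
            trials = [x + s * step * np.eye(n)[i]
                      for i in range(n) for s in (1, -1)]
            best = min(trials, key=f)
            if f(best) < f(x):
                x = best
            else:
                step *= 0.5
        else:
            # Gauss-Seidel-type sampling: accept the first improving
            # direction and poll subsequent coordinates from the new point.
            improved = False
            for i in range(n):
                for s in (1, -1):
                    y = x.copy()
                    y[i] += s * step
                    if f(y) < f(x):
                        x, improved = y, True
                        break
            if not improved:
                step *= 0.5
    return x
```

For instance, minimizing the quadratic f(x) = ||x - 1||^2 from the origin, both variants converge to the minimizer; the Jacobi-type sweep costs 2n evaluations per poll step but is trivially parallelizable, which is one motivation for considering it alongside the sequential Gauss–Seidel-type sweep.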
Acknowledgements
We want to thank the two anonymous referees and the associate editor for their constructive criticism and suggestions, which greatly helped to improve the overall quality of the paper. The second author also acknowledges partial support from the Italian MIUR.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes
1. Formally, a set of vectors $\{v_1,\dots,v_m\} \subset \mathbb{R}^n$ positively spans $\mathbb{R}^n$ if for any $x \in \mathbb{R}^n$ there exist $\lambda_1,\dots,\lambda_m \ge 0$ s.t. $x = \sum_{i=1}^{m} \lambda_i v_i$.
2. We are not taking into consideration the line search, as it is not essential for convergence in this case [12].