Original Articles

Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems

Pages 63-87 | Received 22 Jul 2007, Published online: 04 Mar 2011

Abstract

We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier–Stokes equation model was used for adjoint parameter estimation.

The methods compared consist of three versions of the nonlinear conjugate-gradient (CG) method, the quasi-Newton Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, the limited-memory quasi-Newton (L-BFGS) method [D.C. Liu and J. Nocedal, On the limited memory BFGS method for large scale minimization, Math. Program. 45 (1989), pp. 503–528], the truncated Newton (T-N) method [S.G. Nash, Preconditioning of truncated Newton methods, SIAM J. Sci. Stat. Comput. 6 (1985), pp. 599–616; S.G. Nash, Newton-type minimization via the Lanczos method, SIAM J. Numer. Anal. 21 (1984), pp. 770–788] and a new hybrid algorithm proposed by Morales and Nocedal [J.L. Morales and J. Nocedal, Enriched methods for large-scale unconstrained optimization, Comput. Optim. Appl. 21 (2002), pp. 143–154].
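A rough feel for how these families of minimizers behave can be obtained by running SciPy's implementations of CG, BFGS, L-BFGS and a truncated-Newton-style method (Newton-CG) on a small smooth test problem. This is purely illustrative: the test function (Rosenbrock) and the SciPy method names are this sketch's choices, not the paper's Navier–Stokes cost functional or the authors' implementations.

```python
# Sketch: comparing CG, BFGS, L-BFGS and a truncated-Newton style minimizer
# on a small smooth test problem. Illustration only -- the paper minimizes a
# parabolized Navier-Stokes adjoint cost functional, not Rosenbrock.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0, -1.2, 1.0])  # standard Rosenbrock starting point

results = {}
for method in ("CG", "BFGS", "L-BFGS-B", "Newton-CG"):
    # The analytic gradient here plays the role of the adjoint-computed
    # gradient used in the paper.
    res = minimize(rosen, x0, jac=rosen_der, method=method,
                   options={"maxiter": 2000})
    results[method] = (res.nit, res.fun)

for method, (nit, fval) in results.items():
    print(f"{method:10s} iterations={nit:4d} final f={fval:.2e}")
```

On a problem this small all four methods converge easily; the paper's point is that their relative cost diverges sharply on large-scale, ill-posed problems.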

For all the methods employed and tested, the gradient of the cost function is obtained via an adjoint method. A detailed description of the algorithmic form of each minimization method employed in the comparison is provided.
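The adjoint idea can be illustrated on a toy discrete problem. In the sketch below the forward model is a small linear system A u = m with unknown source vector m, the cost is a least-squares misfit to data d, and a single adjoint solve with A^T yields the exact gradient. The operator, data and dimensions are invented for illustration; this is a generic adjoint sketch, not the paper's parabolized Navier–Stokes adjoint.

```python
# Sketch of an adjoint-computed gradient for a toy problem (not the PNS model):
# forward model  A u = m   (m is the unknown parameter vector),
# cost           J(m) = 0.5 * ||u(m) - d||^2.
# One adjoint solve  A^T lam = u - d  gives the exact gradient dJ/dm = lam,
# at the cost of a single extra linear solve regardless of the size of m.
import numpy as np

rng = np.random.default_rng(0)
n = 8
# A simple well-conditioned tridiagonal "discretized operator"
A = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
d = rng.standard_normal(n)          # synthetic observations
m = rng.standard_normal(n)          # current parameter guess

def cost_and_adjoint_grad(m):
    u = np.linalg.solve(A, m)       # forward solve
    r = u - d                       # data misfit
    lam = np.linalg.solve(A.T, r)   # single adjoint solve
    return 0.5 * r @ r, lam         # J(m) and gradient dJ/dm

J, g = cost_and_adjoint_grad(m)

# Sanity check against one-sided finite differences
eps = 1e-6
g_fd = np.empty(n)
for i in range(n):
    mp = m.copy()
    mp[i] += eps
    g_fd[i] = (cost_and_adjoint_grad(mp)[0] - J) / eps
print("max |adjoint - FD| =", np.max(np.abs(g - g_fd)))
```

The same structure carries over to nonlinear PDE-constrained problems: one forward (model) solve plus one backward (adjoint) solve per gradient evaluation.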

For the inviscid case, the CG-descent method of Hager [W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16(1) (2005), pp. 170–192] performed best, followed closely by the hybrid method [J.L. Morales and J. Nocedal, Enriched methods for large-scale unconstrained optimization, Comput. Optim. Appl. 21 (2002), pp. 143–154], while in the viscous case the hybrid method emerged as the best performer, followed by CG [D.F. Shanno and K.H. Phua, Remark on algorithm 500. Minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 6 (1980), pp. 618–622] and CG-descent [W.W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16(1) (2005), pp. 170–192]. These results required an adequate choice of parameters in the CG-descent method, as well as control of the number of L-BFGS and T-N iterations interlaced in the hybrid method.
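The interlacing in the hybrid method can be sketched schematically by alternating a fixed number of L-BFGS iterations with a short truncated-Newton (Newton-CG) cycle, warm-starting each phase from the other's iterate. The phase lengths and the test problem below are invented for illustration, and plain SciPy calls only mimic the alternation: the actual Morales–Nocedal algorithm additionally passes curvature (preconditioning) information between the two phases.

```python
# Schematic of interlacing L-BFGS and truncated-Newton phases, in the spirit
# of the Morales-Nocedal hybrid. The real algorithm also shares curvature
# information between phases; this sketch only alternates warm-started phases.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x = np.array([-1.2, 1.0, -1.2, 1.0])
n_lbfgs, n_tn = 5, 3          # illustrative interlacing lengths

for cycle in range(40):
    # Phase 1: a few limited-memory quasi-Newton (L-BFGS) iterations
    res = minimize(rosen, x, jac=rosen_der, method="L-BFGS-B",
                   options={"maxiter": n_lbfgs})
    x = res.x
    # Phase 2: a short truncated-Newton (Newton-CG) cycle
    res = minimize(rosen, x, jac=rosen_der, method="Newton-CG",
                   options={"maxiter": n_tn})
    x = res.x
    if np.linalg.norm(rosen_der(x)) < 1e-8:
        break

print("f(x) =", rosen(x), "after", cycle + 1, "cycles")
```

Tuning the phase lengths (here `n_lbfgs` and `n_tn`) corresponds to the control of interlaced L-BFGS and T-N iterations that the abstract notes was required for the hybrid method's good performance.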

Acknowledgements

The authors acknowledge the support of the School of Computational Science, Florida State University. The expert comments of two anonymous reviewers and the suggestions of Dr William Hager considerably improved the presentation and content of the paper and are gratefully acknowledged. Professor Navon acknowledges the support of NSF grants ATM-0201808, managed by Dr Linda Pang, and CCF-0635162, managed by Dr Eun K. Park.

