Abstract
In this paper, we present a globally convergent algorithm for the computation of a minimizer of the Tikhonov functional with sparsity-promoting penalty term for nonlinear forward operators in Banach space. The dual TIGRA method uses a gradient descent iteration in the dual space at decreasing values of the regularization parameter $\alpha_k$, where the approximation obtained with $\alpha_{k-1}$ serves as the starting value for the dual iteration with parameter $\alpha_k$. With the discrepancy principle as a global stopping rule, the method further yields an automatic parameter choice. We prove convergence of the algorithm under suitable step-size selection and stopping rules and illustrate our theoretical results with numerical experiments for the nonlinear autoconvolution problem.
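The warm-started continuation idea sketched in the abstract can be illustrated schematically as follows. This is a simplified sketch only: a smooth $\ell^2$ penalty and plain primal gradient descent stand in for the paper's dual-space iteration with sparsity-promoting penalty, a toy componentwise-squaring operator replaces a genuine nonlinear forward operator, and all function names and parameter values (`alpha0`, `q`, `tau`, step size) are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def tikhonov_continuation(F, dF, y_delta, delta, x0,
                          alpha0=1.0, q=0.5, tau=2.0,
                          inner_steps=200, step=1e-2):
    """Warm-started continuation over decreasing regularization
    parameters alpha_k (schematic sketch, not the dual TIGRA method).

    For each alpha_k, minimizes ||F(x) - y_delta||^2 + alpha_k ||x||^2
    by plain gradient descent; the approximation obtained for
    alpha_{k-1} serves as the starting value for alpha_k.  The
    discrepancy principle ||F(x) - y_delta|| <= tau * delta acts as
    the global stopping rule and selects the final parameter.
    """
    x, alpha = x0.astype(float).copy(), alpha0
    for _ in range(60):                      # outer continuation loop
        for _ in range(inner_steps):         # inner gradient descent
            r = F(x) - y_delta
            grad = 2.0 * dF(x).T @ r + 2.0 * alpha * x
            x -= step * grad
        if np.linalg.norm(F(x) - y_delta) <= tau * delta:
            return x, alpha                  # discrepancy principle met
        alpha *= q                           # decrease parameter, warm start
    raise RuntimeError("discrepancy principle not satisfied")

# Toy nonlinear operator: componentwise squaring (illustrative stand-in).
F  = lambda x: x**2
dF = lambda x: np.diag(2.0 * x)             # Jacobian of F

rng = np.random.default_rng(0)
x_true = np.array([1.0, 0.5, 2.0])
delta = 1e-3                                 # assumed noise level
noise = rng.standard_normal(3)
y_delta = F(x_true) + delta * noise / np.linalg.norm(noise)

x_rec, alpha_final = tikhonov_continuation(F, dF, y_delta, delta,
                                           x0=np.ones(3))
```

The warm start is what makes the scheme practical: the minimizers for neighboring parameters $\alpha_{k-1}$ and $\alpha_k$ are close, so each inner iteration only needs to polish the previous approximation rather than solve from scratch.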
Notes
1 W. Wang was supported by Zhejiang Provincial NSFC [LQ14A010013], S. Anzengruber by the German Science Foundation DFG [HO 1454/8-1], R. Ramlau by the Austrian Science Fund [W1214] and B. Han by NSFC [91230119].