On Cyclic Gradient Descent Reprojection

In recent years, convex optimization methods have been successfully applied to various image processing tasks, and a large number of first-order methods have been designed to minimize the corresponding functionals. Interestingly, it was shown recently by Grewenig et al. that the simple idea of so-called “superstep cycles” leads to very efficient schemes for time-dependent (parabolic) image enhancement problems as well as for steady-state (elliptic) image compression tasks. The superstep-cycle approach is similar to the nonstationary (cyclic) Richardson method, which has been around for over sixty years. In this paper, we investigate the incorporation of superstep cycles into the gradient descent reprojection method. We show for two problems in compressive sensing and image processing, namely the LASSO approach and the Rudin-Osher-Fatemi model, that the resulting simple cyclic gradient descent reprojection algorithm is numerically competitive with various state-of-the-art first-order algorithms. However, due to the nonlinear projection within the algorithm, convergence proofs appear to be hard even under restrictive assumptions on the linear operators. We demonstrate the difficulties by studying the simplest case: a two-cycle algorithm in R^2 with projections onto the Euclidean ball.
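The basic scheme the abstract describes, gradient descent steps with a cyclically varying step size, each followed by reprojection onto a convex set, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it minimizes a simple quadratic over the Euclidean ball, and the two-element step-size cycle is an arbitrary example rather than the Chebyshev-based superstep lengths of Grewenig et al.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Project x onto the Euclidean ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else radius * x / norm

def cyclic_gradient_projection(A, b, x0, cycle, n_cycles=200):
    """Minimize 0.5 * ||A x - b||^2 subject to ||x|| <= 1 by
    gradient steps whose step size tau runs cyclically through
    `cycle`, each step followed by reprojection onto the ball.
    The cycle values here are illustrative, not superstep lengths."""
    x = x0.copy()
    for _ in range(n_cycles):
        for tau in cycle:
            grad = A.T @ (A @ x - b)          # gradient of the quadratic
            x = project_ball(x - tau * grad)  # descent step + reprojection
    return x
```

For instance, with `A` the identity in R^2 and `b = (2, 0)`, the constrained minimizer is `(1, 0)` on the boundary of the unit ball, and the iteration reaches it quickly even though the longer step in the cycle would be too large for a stationary method.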

Metadata
Author: Simon Setzer, Gabriele Steidl, Jan Morgenthaler
URN (permanent link): urn:nbn:de:hbz:386-kluedo-27423
Document Type: Preprint
Language of publication: English
Publication Date: 2011/09/19
Year of Publication: 2011
Publishing Institute: Technische Universität Kaiserslautern
Tags: convex optimization; gradient descent reprojection; superstep cycles
Faculties / Organisational entities: Fachbereich Mathematik
DDC-Classification: 510 Mathematics
Collections: Schriften der AG Mathematische Bildverarbeitung und Datenanalyse