## Minimization and Parameter Estimation for Seminorm Regularization Models with I-Divergence Constraints

This paper deals with the minimization of seminorms $$\|L\cdot\|$$ on $$\mathbb R^n$$ under the constraint of a bounded I-divergence $$D(b,H\cdot)$$. The I-divergence, also known as Kullback-Leibler divergence, appears in many models in imaging science, in particular when dealing with Poisson data. Typically, $$H$$ represents, e.g., a linear blur operator and $$L$$ is some discrete derivative operator. Our preference for the constrained approach over the corresponding penalized version rests on the fact that the I-divergence of data corrupted, e.g., by Poisson noise or multiplicative Gamma noise can be estimated by statistical methods. Our minimization technique builds on relations between constrained and penalized convex problems and resembles the idea of Morozov's discrepancy principle. More precisely, we propose first-order primal-dual algorithms which reduce the problem to the solution of certain proximal minimization problems in each iteration step. The most interesting of these proximal problems is an I-divergence constrained least squares problem. We solve it by connecting it to the corresponding I-divergence penalized least squares problem with an appropriately chosen regularization parameter. Our algorithm therefore produces not only a sequence of vectors which converges to a minimizer of the constrained problem, but also a sequence of parameters which converges to a regularization parameter for which the penalized problem has the same solution as the constrained one. In other words, the solution of this penalized problem fulfills the I-divergence constraint. We provide the proofs necessary to understand our approach and demonstrate the performance of our algorithms for different image restoration examples.
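For concreteness, the objects named above can be sketched as follows; this is the standard (generalized) Kullback-Leibler form of the I-divergence and a generic statement of the constrained model with its penalized counterpart, where the bound $$\tau$$ and the parameter $$\lambda$$ are placeholder symbols not fixed by the abstract:

$$
D(b, v) \;:=\; \sum_{i} \Bigl( b_i \log \frac{b_i}{v_i} \;-\; b_i \;+\; v_i \Bigr), \qquad b_i \ge 0,\; v_i > 0,
$$

$$
\text{(constrained)} \quad \min_{x \in \mathbb R^n} \|Lx\| \quad \text{subject to} \quad D(b, Hx) \le \tau,
$$

$$
\text{(penalized)} \quad \min_{x \in \mathbb R^n} \|Lx\| \;+\; \lambda\, D(b, Hx).
$$

The point made in the abstract is that, for a suitable $$\lambda = \lambda(\tau)$$, the two problems share a solution, and the proposed algorithms recover both the minimizer and this parameter simultaneously.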
