
Primal-dual first order methods for total variation image restoration in presence of Poisson noise

Bonettini, Silvia; Benfenati, Alessandro; Ruggiero, Valeria
2014

Abstract

Image restoration often requires the minimization of a convex, possibly nonsmooth functional given by the sum of a data fidelity measure and a regularization term. To cope with the lack of smoothness, alternative formulations of the minimization problem can be exploited via the duality principle. Indeed, the primal-dual and dual formulations have been well explored in the literature when the data are corrupted by Gaussian noise and the data fidelity term is therefore quadratic. Unfortunately, most of the approaches proposed for the Gaussian case are difficult to apply to general data discrepancy terms, such as the Kullback-Leibler divergence. In this work we propose primal-dual methods that apply to the minimization of a sum of general convex functions and whose iteration is easy to compute, regardless of the form of the objective function, since it essentially consists of a subgradient projection step. We provide a convergence analysis and suggest strategies to improve the convergence speed by means of a careful selection of the steplength parameters. Numerical experiments on Total Variation based denoising and deblurring problems with Poisson data show the behavior of the proposed method compared with other state-of-the-art algorithms.
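To illustrate the kind of problem the abstract describes, the following is a minimal sketch of a generic primal-dual (Chambolle-Pock style) iteration for Total Variation denoising of Poisson data with the Kullback-Leibler fidelity. This is not the authors' algorithm: the fixed steplengths `tau` and `sigma`, the regularization weight `lam`, and the iteration count are assumed values chosen for the sketch, and the dual step is an exact projection rather than the paper's subgradient projection.

```python
import numpy as np

def grad(u):
    # Forward differences with Neumann boundary conditions.
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    # Discrete divergence, the negative adjoint of grad.
    dx = np.zeros_like(px)
    dx[0, :] = px[0, :]
    dx[1:-1, :] = px[1:-1, :] - px[:-2, :]
    dx[-1, :] = -px[-2, :]
    dy = np.zeros_like(py)
    dy[:, 0] = py[:, 0]
    dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]
    dy[:, -1] = -py[:, -2]
    return dx + dy

def prox_kl(x, tau, b):
    # Closed-form proximal operator of tau * sum(u - b*log(u)),
    # the Kullback-Leibler fidelity for Poisson data b.
    return 0.5 * (x - tau + np.sqrt((x - tau) ** 2 + 4.0 * tau * b))

def tv_kl_denoise(b, lam=1.0, n_iter=100, tau=0.25, sigma=0.25):
    # tau * sigma * ||grad||^2 <= 0.5 < 1, so the steps are admissible.
    u = b.astype(float).copy()
    u_bar = u.copy()
    px = np.zeros_like(u)
    py = np.zeros_like(u)
    for _ in range(n_iter):
        # Dual ascent step followed by projection onto {|y| <= lam}.
        gx, gy = grad(u_bar)
        px += sigma * gx
        py += sigma * gy
        norm = np.maximum(1.0, np.sqrt(px ** 2 + py ** 2) / lam)
        px /= norm
        py /= norm
        # Primal step via the KL proximal operator, then extrapolation.
        u_old = u
        u = prox_kl(u + tau * div(px, py), tau, b)
        u_bar = 2.0 * u - u_old
    return u
```

Usage: given a Poisson-corrupted image `b`, `tv_kl_denoise(b)` returns a nonnegative, TV-regularized estimate; the extrapolated variable `u_bar` is what makes the scheme converge with fixed steplengths.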
ISBN: 9781479957514
Keywords: ε-subgradient projection method; Kullback-Leibler divergence; primal-dual method; Total Variation; variable steplengths
Files in this product:
No files are associated with this product.

Documents in SFERA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11392/2291427
Citations
  • Scopus: 10
  • Web of Science: 10