Optimization with a “Kink”: Nonsmooth Optimization in Image Processing

Clason, Christian

Nonsmooth optimization is concerned with minimizing functions that are not differentiable in the classical sense. Christian Clason gives an overview of modern methods and their application in mathematical image processing.

Modern variational methods in mathematical imaging formulate the task of denoising, deblurring, or reconstructing images – photographic or medical – as an optimization problem in which the goal is to find the “optimal” image that minimizes a weighted sum of a discrepancy term, which measures the distance of a candidate image to the given data, and a regularization term, which measures the abstract “goodness” of the candidate image. For the latter, the total variation has turned out to be especially suitable for images, since it allows jumps in the minimizer (corresponding to sharp edges in the image) while still imposing sufficient regularity otherwise to remove noise. However, this term is not differentiable, and hence efficient nonsmooth optimization methods are needed. For convex functions such as the total variation, techniques of convex analysis make it possible to characterize minimizers as fixed points that can be computed either through fixed-point iteration – leading to proximal point and splitting methods – or through iterative linearization – leading to semismooth Newton methods. Current research focuses on generalizing these approaches to nonconvex functions.
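The fixed-point characterization mentioned above can be illustrated on a toy problem. Since the proximal map of the total variation has no simple closed form, the sketch below (not from the original text, and purely illustrative) substitutes the ℓ1 norm as the nonsmooth regularizer and computes the minimizer of a weighted sum of discrepancy and regularization term by forward–backward splitting, a basic proximal splitting method:

```python
import numpy as np

def prox_l1(x, t):
    # proximal map of t*||.||_1: componentwise soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(b, alpha, tau=0.5, iters=200):
    # minimize 0.5*||x - b||^2 + alpha*||x||_1 by the fixed-point iteration
    #   x_{k+1} = prox_{tau*alpha*||.||_1}(x_k - tau*(x_k - b)),
    # i.e. a gradient step on the smooth discrepancy term followed by a
    # proximal step on the nonsmooth regularizer
    x = np.zeros_like(b)
    for _ in range(iters):
        x = prox_l1(x - tau * (x - b), tau * alpha)
    return x

b = np.array([3.0, -0.5, 1.2])  # "noisy data"
x = forward_backward(b, alpha=1.0)
# the fixed point coincides with the known closed-form minimizer soft(b, alpha)
print(np.allclose(x, prox_l1(b, 1.0)))
```

The minimizer is exactly the point where the fixed-point map stops moving; for the total variation, the same iteration applies with `prox_l1` replaced by the (numerically computed) TV proximal map.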

Clason, Christian: Optimierung mit „Knick“. Nichtglatte Optimierung in der Bildverarbeitung. 2019.

Rights

Use and reproduction:
All rights reserved

Export