Convergence rates for variational regularization of inverse problems with stochastic data
Thursday 16 May 2019
Koll. D (1531-211)
We consider variational regularization methods for mildly ill-posed inverse problems described by operator equations $F(f)=g$ in Hilbert spaces. One focus of this talk is on statistical models of the data noise: we present a general framework that treats many noise models and data fidelity terms in a unified setting, including Gaussian noise, Poisson data, and empirical processes, and we show (nearly) optimal rates of convergence as the noise level tends to zero. Such inverse problems occur, for example, in parameter identification for stochastic differential equations and in photonic imaging. The main tools of our analysis are large deviation inequalities for stochastic processes in negative Besov norms.
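Variational regularization estimators of the kind discussed here typically take a Tikhonov-type form; the following display is a generic sketch, where the fidelity term $S$ and penalty $R$ are placeholders chosen to match each noise model, not the specific functionals of the talk:

```latex
% Generic variational (Tikhonov-type) regularization scheme:
% given observed data g^{obs}, minimize a data fidelity term S
% plus a regularization penalty R weighted by alpha > 0.
\hat{f}_\alpha \in \operatorname*{argmin}_{f \in \mathcal{F}}
  \Big[\, S\big(F(f);\, g^{\mathrm{obs}}\big) + \alpha\, R(f) \,\Big]
% For instance, S(g; g^{obs}) = \|g - g^{obs}\|^2 is the classical
% choice for Gaussian noise, while a Kullback--Leibler divergence
% is natural for Poisson data.
```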
Contact: Jevgenijs Ivanovs