Emerging fields such as data analytics, machine learning, and uncertainty quantification rely heavily on efficient computational methods for solving inverse problems. With growing model complexity and ever-increasing data volumes, state-of-the-art inference methods are reaching the limits of their applicability, and novel methods are urgently needed.
In this talk, we present novel methods for the broad spectrum of inverse problems whose aim is to reconstruct quantities with a sparse representation in some vector space. The associated optimization problems with L1 regularization have received significant attention due to their wide applicability in compressed sensing, dictionary learning, and imaging problems, to name a few. We present a new method based on variable projection and describe a new approach that uses deep neural networks (DNNs) to obtain regularization parameters for solving inverse problems.
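As a point of reference for the L1-regularized problems mentioned above, the sketch below illustrates one standard (not the speaker's) solver, the iterative shrinkage-thresholding algorithm (ISTA), applied to min_x 0.5*||Ax - b||^2 + lam*||x||_1 on synthetic data; all names and parameter values here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, tau):
    # Elementwise soft-thresholding: the proximal operator of tau*||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    # ISTA for  min_x 0.5*||A x - b||^2 + lam*||x||_1
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
    t = 1.0 / L                          # step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Gradient step on the least-squares term, then shrinkage
        x = soft_threshold(x - t * A.T @ (A @ x - b), t * lam)
    return x

# Synthetic underdetermined problem with a sparse ground truth
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

Because the soft-thresholding step produces exact zeros, the recovered `x_hat` is itself sparse; the regularization parameter `lam` (the quantity the DNN-based approach in the talk learns to select) controls that sparsity.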
The aim of this talk is to engage students and faculty alike and initiate a discussion about future directions of computational mathematics in the world of big data and machine learning.
DoMSS Seminar
Monday, April 3
1:30 pm
WXLR A302
Matthias Chung
Associate Professor
Department of Mathematics
Emory University