Head:
Jia-Jie Zhu

Coworkers:
Zusen Xu

Secretary:
Christine Schneider


The group focuses on research in state-of-the-art machine learning and optimization. For example, we are interested in robustness theory for machine learning and optimization, which requires computational optimization tools that can manipulate probability distributions, inherently infinite-dimensional objects. We mainly work on variational methods for machine learning and optimization over probability distributions, rooted in the theory of gradient flows and the optimal transport of probability measures.

Highlights

March 11th - 15th, 2024. Organization of the Workshop on Optimal Transport from Theory to Applications: Interfacing Dynamical Systems, Optimization, and Machine Learning (OT-DOM) in Berlin, Germany. program and slides


Aims

Our goal is to interface large-scale computational algorithms in machine learning and optimization with dynamical-systems theory such as (PDE) gradient flows and optimal transport. The focus topics include: optimization and machine learning applications of the (Wasserstein-)Fisher-Rao, a.k.a. (Spherical-)Hellinger-Kantorovich, gradient flows; kernel methods for dynamical systems and optimization over probability distributions; and robust learning and optimization under distribution shift and causal confounding.
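As a minimal illustration of the particle viewpoint behind such gradient flows (a sketch, not the group's actual methods): for the potential-energy functional F(mu) = ∫ V dmu, the Wasserstein gradient flow reduces to each particle following the ODE dx/dt = -∇V(x). With the assumed quadratic potential V(x) = ||x||²/2, an explicit Euler discretization over an empirical particle approximation looks like this:

```python
import numpy as np

# Assumed potential V(x) = 0.5 * ||x||^2, so grad V(x) = x.
# The Wasserstein gradient flow of F(mu) = int V dmu moves each
# particle independently by dx/dt = -grad V(x).
def grad_V(x):
    return x

rng = np.random.default_rng(0)
particles = rng.normal(size=(200, 2))  # empirical approximation of mu_0

tau = 0.1  # explicit Euler step size
for _ in range(100):
    particles = particles - tau * grad_V(particles)

# The empirical measure contracts toward the minimizer of V (the origin).
print(float(np.abs(particles).max()))
```

Each step multiplies every coordinate by (1 - tau), so after 100 steps the particle cloud has contracted by roughly (0.9)^100 toward the origin, mirroring the convergence of the continuous flow to the minimizer of F.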