Given by Dejan Slepčev (Carnegie Mellon University, Pittsburgh) from Tuesday, November 16 to Friday, November 19, 2021, the Jacques-Louis Lions 2021 Lectures will consist of:

- a mini-course
Variational problems and PDE on random structures: analysis and applications to data science
three sessions, on Tuesday 16, Wednesday 17, and Thursday 18 November 2021, from 12:00 to 13:15,

- and a colloquium
Machine learning meets calculus of variations
Friday, November 19, 2021 from 14:00 to 15:00.

The presentations will be broadcast live via Zoom.
The locations will be announced later.

Mini-course abstract
Variational problems and PDE on random structures: analysis and applications to data science
Many machine learning tasks, such as clustering, regression, classification, and dimensionality reduction, are commonly described as optimization problems. Namely, these tasks are modeled by introducing functionals (defined using the available random sample) which specify the desired properties of the object sought. While the data are often high-dimensional, they usually have an intrinsic low-dimensional structure that makes the learning tasks feasible. This intrinsic geometric structure is often encoded by a graph created by connecting nearby data points. We will introduce mathematical tools used to study variational problems and PDE-based models posed on random data samples. In particular, we will discuss the passage from discrete optimization problems on random samples to their continuum limits. This will be used to establish the asymptotic consistency of several important machine learning algorithms.
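To fix ideas, here is a minimal sketch, in illustrative notation of our choosing (sample points x_1, ..., x_n drawn from a density ρ, a kernel η with η_ε(z) = ε^{-d} η(|z|/ε), and a graph bandwidth ε_n), of the kind of discrete-to-continuum statement the course addresses. A prototypical graph functional is the graph Dirichlet energy

\[
E_n(u) \;=\; \frac{1}{\varepsilon_n^2\, n^2} \sum_{i,j=1}^{n} \eta_{\varepsilon_n}(x_i - x_j)\, \bigl(u(x_i) - u(x_j)\bigr)^2 .
\]

As n → ∞ and ε_n → 0 at a suitable rate, such functionals Γ-converge, up to a kernel-dependent constant σ_η, to a weighted continuum energy

\[
E(u) \;=\; \sigma_\eta \int |\nabla u(x)|^2 \, \rho(x)^2 \, dx ,
\]

from which the asymptotic consistency of graph-based algorithms can be read off.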
We will cover the basic elements of the background material on calculus of variations and optimal transportation. Furthermore, we will develop connections to nonlocal functionals, which serve as intermediate objects between the discrete functionals and their continuum limits. We will also consider approaches based on dynamics on graphs and connect these with the evolution equations describing the continuum limits.
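Concretely, and again in the illustrative notation above, the nonlocal intermediate functional is obtained by replacing the empirical sums with integrals against the density ρ:

\[
E_\varepsilon(u) \;=\; \frac{1}{\varepsilon^2} \iint \eta_\varepsilon(x - y)\, \bigl(u(x) - u(y)\bigr)^2 \rho(x)\, \rho(y)\, dx\, dy .
\]

The discrete-to-continuum passage can then be split into two more tractable steps: comparing E_n with E_ε (a law-of-large-numbers argument) and passing from E_ε to the local limit E as ε → 0.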

Colloquium abstract
Machine learning meets calculus of variations
Modern data-acquisition technology produces a wealth of data about the world we live in. The goal of machine learning is to extract and interpret the information these data sets contain. This leads to a variety of learning tasks, many of which seek to optimize a functional defined on the available random sample.
The functionals take as input the available data samples, yet we seek to draw conclusions about the true distribution of the data. To compare the outcomes based on finite data with the ideal outcomes one would obtain if full information were available, we study the asymptotic properties of the discrete optimization problems based on finite random samples. We will discuss how calculus of variations and partial differential equations provide tools to compare the discrete and continuum descriptions for many relevant problems. Furthermore, we will discuss how the insights from analysis can be used to guide the design of the functionals used in machine learning.
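One concrete instance of such a comparison, sketched in illustrative notation (kernel η_ε as above; scalings and constants omitted): for spectral clustering one studies the graph Laplacian built from the sample,

\[
(L_n u)(x_i) \;=\; \sum_{j=1}^{n} \eta_{\varepsilon_n}(x_i - x_j)\, \bigl(u(x_i) - u(x_j)\bigr),
\]

and shows that, suitably rescaled, its eigenvalues and eigenvectors converge to those of the weighted continuum operator

\[
\Delta_\rho u \;=\; -\frac{1}{\rho}\, \mathrm{div}\!\left(\rho^2 \nabla u\right),
\]

where ρ is the density of the data distribution. Results of this type quantify how conclusions drawn from the finite sample relate to the underlying distribution.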