Registration is open!

The “Mathematical Foundations of AI” workshop, jointly organized by Institut DataIA and SCAI, in collaboration with several scientific societies and foundations — the Jacques Hadamard Mathematical Foundation (FMJH), the Fondation Sciences Mathématiques de Paris (FSMP), the MALIA group of the French Statistical Society, and the French-speaking Learned Society for Machine Learning (SSFAM) — aims to provide an overview of several promising research directions at the interface between statistical learning and artificial intelligence.

This event is part of the Maths & AI network in the Île-de-France region, in which FMJH and DataIA are key partners.

This new edition will focus on identifiability issues, whether in tensor analysis, neural networks, or generative AI. The day will feature three plenary lectures delivered by renowned researchers and leading experts in the field: François Malgouyres, Elisabeth Gassiat, and Pavlo Mozharovskyi.

The workshop also offers an opportunity for young researchers to present their work through short talks (see the call for contributions).


Call for Contributions

As part of the workshop, participants are invited to submit a detailed abstract for a possible oral presentation or poster session. During the selection process, the committee aims to provide maximum visibility to PhD students, researchers, and faculty members.

When submitting your application by email (maths-ia@inria.fr), please include the following information:

Submission deadline: November 21, 2025.


Program

9:00 – 10:00 | Keynote 1: François Malgouyres (Université de Toulouse)

Geometry-Induced Regularization and Identifiability of Deep ReLU Networks

Abstract:
The first part of this talk will present, through a simple and pedagogical example, the mathematical results developed in the second part, in order to make the underlying intuition accessible to a broad audience. Due to an implicit regularization that favors “good” networks, neural networks with a large number of parameters generally do not overfit. Among the related and still poorly understood phenomena are the properties of flat minima, saddle-to-saddle dynamics, and neuron alignment.

To analyze these phenomena, we study the local geometry of deep ReLU networks. We show that, for a fixed architecture, as the weights vary, the image of a sample X forms a set whose local dimension changes. The parameter space is thus partitioned into regions where this local dimension remains constant. The local dimension is invariant under the natural symmetries of ReLU networks (i.e., positive rescaling and neuron permutation).

We then establish that the geometry of the network induces a regularization effect, with the local dimension serving as a key measure of regularity. Furthermore, we relate the local dimension to a new notion of flatness of minima and to saddle-to-saddle dynamics. For one-hidden-layer networks, we also show that the local dimension is linked to the number of linear regions perceived by X, shedding light on the effect of regularization. This result is supported by experiments and connected to neuron alignment. Finally, we present experiments on the MNIST dataset, highlighting geometry-induced regularization in this context. The talk concludes by linking properties of the local dimension to the local identifiability of network parameters.
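The two symmetries mentioned in the abstract are easy to verify numerically. The sketch below (an illustration written for this page, not code from the talk) builds a one-hidden-layer ReLU network and checks that positively rescaling a hidden neuron's incoming weights while inversely rescaling its outgoing weights, or permuting the hidden neurons, leaves the computed function unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1)
d_in, d_hidden, d_out = 3, 5, 2
W1 = rng.normal(size=(d_hidden, d_in))
b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden))

def f(x, W1, b1, W2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0)

x = rng.normal(size=d_in)

# Positive rescaling: scale neuron i's incoming weights and bias by c_i > 0
# and its outgoing weights by 1/c_i; since relu(c*t) = c*relu(t) for c > 0,
# the overall function is unchanged.
c = rng.uniform(0.5, 2.0, size=d_hidden)
W1s, b1s, W2s = W1 * c[:, None], b1 * c, W2 / c[None, :]
assert np.allclose(f(x, W1, b1, W2), f(x, W1s, b1s, W2s))

# Neuron permutation: relabeling the hidden units also leaves f unchanged.
p = rng.permutation(d_hidden)
assert np.allclose(f(x, W1, b1, W2), f(x, W1[p], b1[p], W2[:, p]))
```

Because distinct parameter vectors can thus compute the same function, identifiability of ReLU network parameters can at best hold up to these symmetries, which is the setting the talk's local-dimension analysis addresses.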

Biography:
François Malgouyres is a Professor at the University of Toulouse (France). His research focuses on the theoretical and methodological foundations of deep learning, with particular interest in understanding the mathematical structure of neural networks. His work includes studies on network geometry, parameter identifiability, function approximation by neural networks, weight quantization in recurrent networks, and the design of orthogonal convolutional layers. He has also investigated the straight-through estimator — the reference algorithm for optimizing quantized weights — and its applications to sparse signal reconstruction.

Before joining the University of Toulouse, François Malgouyres received his PhD from ENS Paris-Saclay (then located in Cachan), was a postdoctoral researcher at UCLA, and served as an Associate Professor at Université Paris Nord.

10:00 – 10:30 | Coffee Break

10:30 – 11:30 | Keynote 2: Elisabeth Gassiat (Laboratoire de Mathématiques d'Orsay)
TBA

11:30 – 12:30 | Short Contributive Talks (3 × 15 min)

12:30 – 13:45 | Lunch Break

13:45 – 14:45 | Keynote 3: Pavlo Mozharovskyi (Télécom ParisTech)
TBA

14:45 – 15:30 | Afternoon Break

15:30 – 17:00 | Short Contributive Talks (6 × 15 min)