CIMI thematic trimester “Beyond classical regimes in statistical inference and machine learning”

I am thrilled to announce that, together with Xiaoyi Mai, Mohamed Tamaazousti and Vincent Lahoche, I am organizing a CIMI thematic trimester titled “Beyond classical regimes in statistical inference and machine learning,” to be held in Fall 2024. Here is a summary of its scientific program:

  • (September 2024, precise date TBA) An opening colloquium on recent developments and challenges in high-dimensional statistical inference and machine learning.
  • (From 7 October to 11 October 2024) A one-week thematic school with the theme “Models”, featuring mini-courses on tools and techniques for the analysis of high-dimensional models in statistical inference and machine learning, including (but not limited to) tools from random matrix theory, statistical physics and spin glasses.
  • (From 14 October to 18 October 2024) A one-week thematic school with the theme “Optimization”, featuring mini-courses on the study of high-dimensional random optimization landscapes and on the dynamics of optimization algorithms in high dimensions.
  • (From 4 November to 8 November 2024) A workshop focused on recent results in high-dimensional (supervised and unsupervised) machine learning and statistical inference. This workshop will in particular feature a round-table debate with top experts in this domain about the major open problems in the field and some promising trends and recent developments.

More details (and a website) are coming soon, in particular regarding a call for poster contributions to be presented during the workshop.

Our chair on large random tensors was selected for ANITI 2.0!

I’m excited to announce that our chair titled “LArge Tensors for daTa analysIs and maChine lEarning” (LATTICE) was selected for the second phase of ANITI, the AI institute of Toulouse. It will thus be part of ANITI’s proposal for setting up an AI cluster in Toulouse, in response to the currently open ANR call.

This chair, held by me (as PI) and Xiaoyi Mai of IMT (as co-chair), is devoted to the study of large random tensor models, their estimation, and their applications to (unsupervised) machine learning. An abstract is available on ANITI’s website, along with the list of all other selected chairs.

More info on that will be available later this year.

Internship on the analysis of tensor-based methods for machine learning

We are looking for strongly motivated M2 students with a solid background in probability, statistics and optimization for a (5- or) 6-month internship in 2023, focused on the analysis of tensor decomposition methods for machine learning in the large-data regime. The student will be co-supervised by Xiaoyi Mai, Assistant Professor at the University of Toulouse and member of the Institut de Mathématiques de Toulouse (IMT), and myself.

A detailed description can be found here.

Ph.D. position in statistical analysis / statistical signal processing

Together with Nicolas Dobigeon, I am looking for strongly motivated candidates with a solid background in statistics, linear algebra and optimization for a 3-year Ph.D. position on the subject “Simplex component analysis in the small-data regime for spectral imaging”. The student will be a member of the Signals & Communications team of IRIT, located at the ENSEEIHT site.

More details on this offer can be found at this link.

Paper “A Random Matrix Perspective on Random Tensors” accepted by JMLR

We are pleased to announce that our paper, in which we propose using random matrix theory (RMT) tools to study random tensor models, has been accepted by JMLR!

The central idea of our approach is to study the spectra of matrices arising from contractions of a random tensor model with well-chosen directions. In particular, in the JMLR paper we leverage this idea to derive a novel characterization of the performance of the maximum likelihood estimator (MLE) of a symmetric rank-one Gaussian tensor model. Although previous studies using tools from spin glass theory had already established a precise characterization of the MLE for this model, our approach is more elementary and accessible, and we believe it will lend itself more easily to the study of other, more structured and more general tensor models.
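
For concreteness, here is a rough sketch of the objects involved (the notation and normalization are mine and purely illustrative, not necessarily those of the paper). Writing the symmetric rank-one (“spiked”) model as

\[
\mathbf{T} \;=\; \beta\, x^{\otimes 3} \;+\; \frac{1}{\sqrt{n}}\,\mathbf{W}, \qquad \|x\| = 1,
\]

with $\beta$ a signal-to-noise parameter, $x$ the planted vector and $\mathbf{W}$ a symmetric Gaussian noise tensor, the contraction with a direction $u$ is the matrix

\[
M(u)_{ij} \;=\; \sum_{k=1}^{n} T_{ijk}\, u_k \;=\; \beta\,(x^{\top}u)\, x_i x_j \;+\; \frac{1}{\sqrt{n}} \sum_{k=1}^{n} W_{ijk}\, u_k ,
\]

that is, a rank-one perturbation of a Wigner-like random matrix, precisely the kind of object that random matrix theory is equipped to handle.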

In fact, Mohamed Seddik, Romain Couillet and Maxime Guillaud have already extended this approach to asymmetric rank-one Gaussian models, obtaining novel results on the MLE for these models and on the spectra of contractions of the associated ensemble (see https://arxiv.org/abs/2112.12348).

Check out our preprint on arXiv for more details: https://arxiv.org/pdf/2108.00774.pdf

M2 Internship (5 to 6 months) on tensor-based machine learning methods

We are looking for strongly motivated candidates with a solid background in mathematics and statistics and good programming skills in scientific computing languages (Python, Matlab, Julia), for an M2 internship on the performance of tensor-based machine learning methods for large-scale data.

The internship will last 5 to 6 months, starting in Spring 2022, and will take place at the ENSEEIHT site of IRIT, in Toulouse. It will be co-supervised by Henrique Goulart (Assistant Professor at Toulouse INP) and Rodrigo Cabral (Assistant Professor at Polytech Nice), and will be fully funded by a CIMI grant.

For more details on this offer, please see the full description at https://cloud.irit.fr/index.php/s/FyzGeSlAT9LfSkw

New preprint on random tensors

We have just posted a new preprint on arXiv, called “A Random Matrix Perspective on Random Tensors,” which concerns large-dimensional random tensor models. Our main contribution is methodological: we show that such models can be studied by borrowing tools from random matrix theory. The key idea is to consider contractions of a given random tensor, which effectively give rise to random matrices whose spectra can be related back to spectral properties of the tensor itself. In particular, in the case of a symmetric rank-one Gaussian model, this allows us to reach precise predictions regarding the performance of the maximum likelihood estimator of the “planted” vector (also known as the spike), which had hitherto been obtained only via statistical physics methods. This viewpoint also brings interesting insights into the landscape of the maximum likelihood problem. Our approach is quite versatile and can be extended to asymmetric, non-Gaussian and higher-order models. This is a collaboration with Romain Couillet and Pierre Comon from Gipsa-lab. Check out the paper for more details!
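
As a toy numerical illustration of this contraction idea (my own sketch, with normalization conventions chosen for simplicity rather than to match the paper), one can generate a symmetric spiked Gaussian tensor, contract it along a direction and inspect the spectrum of the resulting matrix:

import numpy as np

# Toy illustration (not the paper's code): contract a symmetric spiked
# Gaussian tensor along the planted direction and inspect the spectrum.
rng = np.random.default_rng(0)
n, beta = 200, 3.0

# Planted unit-norm spike x
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

# Symmetrized Gaussian noise tensor W
G = rng.standard_normal((n, n, n))
W = (G + G.transpose(0, 2, 1) + G.transpose(1, 0, 2)
       + G.transpose(1, 2, 0) + G.transpose(2, 0, 1) + G.transpose(2, 1, 0)) / 6

# Spiked model T = beta * (x outer x outer x) + W / sqrt(n)
T = beta * np.einsum('i,j,k->ijk', x, x, x) + W / np.sqrt(n)

# Contraction along x: M_ij = sum_k T_ijk x_k (a symmetric n x n matrix)
M = np.einsum('ijk,k->ij', T, x)

eigval, eigvec = np.linalg.eigh(M)          # eigenvalues in ascending order
print("top eigenvalue:", eigval[-1])        # isolated spike (for this beta)
print("second eigenvalue:", eigval[-2])     # near the edge of the noise bulk
print("alignment |<v, x>|:", abs(eigvec[:, -1] @ x))  # top eigenvector correlates with x

For a sufficiently large signal-to-noise parameter, an isolated eigenvalue detaches from the noise bulk and its eigenvector correlates with the planted vector, which is the kind of spectral behaviour that a random matrix analysis characterizes precisely.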

Accepted paper at EUSIPCO 2021

Glad to announce that our paper with Phillip Burt (Escola Politécnica of the University of São Paulo), titled “Volterra kernels of bilinear systems have tensor train structure,” has been accepted at EUSIPCO 2021! It will be presented in the special session on Tensor and Matrix Methods organized by Rémy Boyer, Nicolas Gillis and Xiao Fu.

In this paper, we show that the Volterra kernels of any bilinear system have a natural and exact tensor-train structure that can be exploited to obtain a low-cost discrete-time “Volterra-like” model for predicting its outputs. This is a valuable property, since discrete-time Volterra models (essentially polynomial models with memory) are quite easy to implement and can be made stable by construction simply by truncating their memory; on the other hand, the resulting realization becomes very costly as the memory (in samples) and the degree grow. Furthermore, by virtue of the Carleman bilinearization, this idea can be readily extended to systems described by a more general nonlinear differential equation that is linear in the input and involves only analytic functions (so-called linear-analytic systems, even though this terminology can be a little misleading).
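
To give an idea of where this structure comes from, here is a sketch under standard state-space assumptions (the exact form and conventions used in the paper may differ). For a bilinear system $\dot{x}(t) = A x(t) + N x(t)\,u(t) + b\,u(t)$, $y(t) = c^{\top} x(t)$, the degree-$n$ (triangular) Volterra kernel can be written, up to ordering conventions on the lag variables, as a product of matrices:

\[
h_n(\tau_1,\dots,\tau_n) \;=\; c^{\top} e^{A\tau_n}\, N\, e^{A\tau_{n-1}}\, N \cdots N\, e^{A\tau_1}\, b .
\]

Once the lags are sampled on a grid, every kernel entry is thus a product of factors each depending on a single index, which is precisely a tensor-train representation with ranks bounded by the state dimension.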

A preprint is available on HAL: https://hal.archives-ouvertes.fr/hal-03233382v1

Invited talk at “Random tensors at CIRM”

I’m happy to announce that I’ve been invited to give a talk at a conference on random tensors, which will be held at the Centre International de Rencontres Mathématiques (CIRM), in Marseille, in March 2022!

The program of this event will be quite rich, featuring several nice talks and mini-lectures by mathematicians and physicists working on topics related to tensors and random tensors — check out the website for the detailed program: https://tensors-2022.sciencesconf.org/.