We have just posted a new preprint on arXiv, called “A Random Matrix Perspective on Random Tensors,” which concerns large-dimensional random tensor models. Our main contribution is methodological: we show that such models can be studied with tools borrowed from random matrix theory. The key idea is to consider contractions of a given random tensor, which give rise to random matrices whose spectra can be related back to spectral properties of the tensor itself. In particular, for a symmetric rank-one Gaussian model, this allows us to derive precise predictions of the performance of the maximum likelihood estimator of the “planted” vector (also known as the spike), which had hitherto been obtained only via statistical physics methods. This viewpoint also yields interesting insights into the landscape of the maximum likelihood problem. Our approach is quite versatile and extends to asymmetric, non-Gaussian and higher-order models. This is a collaboration with Romain Couillet and Pierre Comon from Gipsa-lab. Check out the paper for more details!
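To give a flavor of the contraction idea, here is a minimal numerical sketch (our own illustration, not code from the paper): we build a symmetric order-3 spiked Gaussian tensor and contract it along one mode with the planted vector, which yields a symmetric random matrix. For a sufficiently strong spike, the top eigenvalue of that matrix detaches from the noise bulk and its leading eigenvector aligns with the spike. The dimension `n`, signal strength `beta`, and the `1/sqrt(n)` noise normalization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 100, 3.0  # illustrative dimension and spike strength

# Planted unit-norm vector (the "spike")
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

# Symmetrized Gaussian noise tensor W
G = rng.standard_normal((n, n, n))
W = (G + G.transpose(0, 2, 1) + G.transpose(1, 0, 2) + G.transpose(1, 2, 0)
     + G.transpose(2, 0, 1) + G.transpose(2, 1, 0)) / 6.0

# Spiked rank-one model: T = beta * x⊗x⊗x + noise / sqrt(n)
T = beta * np.einsum('i,j,k->ijk', x, x, x) + W / np.sqrt(n)

# Contract T along its third mode with x: M = T(·,·,x), an n×n symmetric matrix
M = np.einsum('ijk,k->ij', T, x)

# The spectrum of M reflects the spike: its top eigenvalue separates from the bulk
eigvals, eigvecs = np.linalg.eigh(M)
print("largest eigenvalue:", eigvals[-1])
print("alignment of top eigenvector with spike:", abs(eigvecs[:, -1] @ x))
```

In the paper the contraction direction is tied to the estimator being analyzed; contracting with a fixed known vector, as done here, is only meant to show how a tensor problem turns into a tractable matrix spectrum.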