Lenka Zdeborová - Insights on gradient-based algorithms in high-dimensional non-convex learning.


Gradient descent algorithms and their noisy variants, such as Langevin dynamics or multi-pass SGD, are at the centre of attention in machine learning. Yet their behaviour remains perplexing, in particular in the high-dimensional non-convex setting. In this talk, I will present several high-dimensional and non-convex statistical learning problems in which the performance of gradient-based algorithms can be analysed down to a constant. What these settings have in common is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of the gradient-based algorithms. The covered settings include the spiked mixed matrix-tensor model, phase retrieval, and high-dimensional Gaussian mixtures. Based on: arXiv:1812.09066 (PRX), arXiv:1902.00139 (ICML19), arXiv:1907.08226 (NeurIPS19), arXiv:2006.06560 (NeurIPS20), arXiv:2006.06997 (NeurIPS20), arXiv:2006.06098 (NeurIPS20)
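To make the algorithmic objects in the abstract concrete: Langevin dynamics is gradient descent plus Gaussian noise whose scale is set by a temperature parameter. A minimal sketch on a toy non-convex (double-well) loss is below; the loss function, step size, and temperature here are illustrative choices of mine, not taken from the talk or the papers cited.

```python
import numpy as np

def langevin_step(x, grad, lr, temperature, rng):
    """One Langevin update: a gradient step plus noise of std sqrt(2 * lr * T)."""
    noise = rng.standard_normal(x.shape)
    return x - lr * grad(x) + np.sqrt(2.0 * lr * temperature) * noise

def grad_double_well(x):
    """Gradient of the toy non-convex loss f(x) = sum((x_i^2 - 1)^2)."""
    return 4.0 * x * (x**2 - 1.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(50)  # a (modestly) high-dimensional starting point
for _ in range(2000):
    x = langevin_step(x, grad_double_well, lr=1e-2, temperature=1e-3, rng=rng)

# At low temperature the iterates settle near the minima at +/-1,
# so this average distance should be small.
print(np.mean(np.abs(np.abs(x) - 1.0)))
```

Setting `temperature=0` recovers plain gradient descent; the high-dimensional analyses discussed in the talk characterise exactly this kind of dynamics in the limit of large dimension.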