
Seminars

 

Since IRIT is spread across several sites, its seminars are organized and held either at Université Toulouse 3 Paul Sabatier (UT3), Université Toulouse 1 Capitole (UT1), INP-ENSEEIHT, or Université Toulouse 2 Jean Jaurès (UT2J).

 

Fresh views on convex optimization: clustered optimization and integration methods of the gradient flow

Vincent ROULET - INRIA Paris (France)

Monday, May 29, 2017, 2:00 pm - 3:30 pm
INP-ENSEEIHT, Salle des thèses

Abstract

In this talk I will present how convex optimization, an old and rather well understood field, can serve as a basis for tackling new problems. The first approach is bottom-up: start from a practical combinatorial problem and tackle it with convex optimization. The second approach is top-down: use the theory of integration methods to better understand convex optimization and generalize it to new settings.

Precisely, in the first part, I will present the problem of grouping variables for a prediction task. For example, when predicting the rating of a movie from its reviews, one may want to group synonyms that have the same influence. This forms an alternative to sparse optimization, where the goal is to select variables. I will provide several optimization schemes and a statistical analysis that extend the traditional analysis of sparse optimization and open new questions on partitioning variables for a prediction task.
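To make the grouping idea concrete, here is a toy illustration (a hedged sketch, not the speaker's algorithm): if coefficients are known to be constant within each group of variables, the model only needs one weight per group, so encoding the partition as a 0/1 assignment matrix `Z` reduces a p-dimensional least-squares problem to a k-dimensional one. The partition, dimensions, and data below are all hypothetical.

```python
import numpy as np

# Hypothetical setup: 6 variables partitioned into 3 groups, coefficients
# shared within each group (e.g. groups of synonyms with equal influence).
rng = np.random.default_rng(0)
n, p = 100, 6
groups = np.array([0, 0, 1, 1, 2, 2])      # assumed known partition
Z = np.eye(3)[groups]                       # p x k assignment matrix
w_true = Z @ np.array([1.0, -2.0, 0.5])     # group-wise constant coefficients

X = rng.standard_normal((n, p))
y = X @ w_true                              # noiseless responses for illustration

# Solve the reduced k-dimensional least squares, then expand back to p coords.
v, *_ = np.linalg.lstsq(X @ Z, y, rcond=None)
w_hat = Z @ v
```

With a noiseless, well-conditioned design, the reduced problem recovers the group-wise coefficients exactly; the interesting statistical questions in the talk concern what happens when the partition itself must be learned.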

In the second part, I will present how optimization algorithms can be seen as integration methods for the gradient flow equation. This viewpoint can serve as an introduction to convex optimization and explains the acceleration phenomenon of Nesterov's optimal algorithm: it integrates the gradient flow equation with larger steps. A potential extension to non-convex optimization will be briefly discussed.
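The basic instance of this correspondence is standard: applying the explicit (forward) Euler scheme with step size h to the gradient flow ODE x'(t) = -∇f(x(t)) yields exactly gradient descent with step size h. The minimal sketch below checks this on an assumed quadratic f(x) = ½ xᵀAx (the matrix and step size are illustrative choices, not from the talk).

```python
import numpy as np

# Gradient flow for f(x) = 0.5 * x^T A x is the ODE  x'(t) = -A x(t).
# Forward Euler with step h gives x_{k+1} = x_k - h * A x_k, which is
# exactly gradient descent on f with step size h.

A = np.diag([1.0, 10.0])        # illustrative convex quadratic, L = 10
grad = lambda x: A @ x

def euler_gradient_flow(x0, h, n_steps):
    """Integrate x' = -grad f(x) with forward Euler = run gradient descent."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - h * grad(x)     # one Euler step = one gradient descent step
    return x

x_final = euler_gradient_flow([1.0, 1.0], h=0.05, n_steps=500)
# For h < 2 / L the iterates contract toward the minimizer x* = 0.
```

The talk's point is the converse direction: viewing more sophisticated schemes (such as Nesterov's method) as higher-order or larger-step integrators of the same flow, which is where the acceleration phenomenon appears.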


Bio: I am a 3rd-year PhD student under the supervision of Alexandre d'Aspremont in the SIERRA team of INRIA Paris, directed by Francis Bach. Previously, I received my Master's degree from the Master Mathématiques, Vision et Apprentissage (MVA) at ENS Cachan and my engineering degree from Ecole Polytechnique. During my thesis I have worked on structural assumptions for machine learning problems: how the information needed for a prediction task can be compressed during the optimization procedure. This led me to analyze the geometric properties of sparse optimization problems and then to study convex optimization through integration methods.

 
