Maxime Vono

Welcome

I have been a Ph.D. student since October 2017 in Toulouse, France, supervised by Nicolas Dobigeon and Pierre Chainais within the SC group of the IRIT laboratory. I am also affiliated with the ORION-B project.

Prior to that, I graduated from Ecole Centrale de Lille with a major in data science. I also hold an M.Sc. degree in applied mathematics from the University of Lille.

I work on Monte Carlo methods for statistical machine learning and signal processing. I am particularly interested in the connections between optimization and simulation-based approaches.

News/Events:
May 2019 Submitted a paper on ADMM-type MCMC sampling with a dimension-free convergence rate.
March-April 2019 I will visit Arnaud Doucet's research group at the University of Oxford.
February 2019 Submitted a paper on asymptotically exact data augmentation (AXDA).
February 2019 Two papers accepted at ICASSP 2019.
January 2019 Our paper on the split-and-augmented Gibbs sampler (a.k.a. ADMM-inspired sampling) has been accepted for publication in IEEE Transactions on Signal Processing.
Starting September 2018 Data science/analytics consultancy missions for a large retailer.

Research

Preprints

  1. Efficient MCMC sampling with dimension-free convergence rate using ADMM-type splitting
    M. Vono*, D. Paulin*, and A. Doucet
    * equal contribution

    Performing exact Bayesian inference for complex models is intractable. Markov chain Monte Carlo (MCMC) algorithms can provide reliable approximations of the posterior distribution but are computationally expensive for large datasets. A standard approach to mitigate this complexity consists in using subsampling techniques or distributing the data across a cluster. However, these approaches are typically unreliable in high-dimensional scenarios. We focus here on an alternative class of MCMC schemes exploiting a splitting strategy akin to the one used by the celebrated ADMM optimization algorithm. These methods, proposed recently in [43, 51], appear to provide empirically state-of-the-art performance. Here we generalize these ideas and propose a detailed theoretical study of one of these algorithms, known as the Split Gibbs Sampler. Under regularity conditions, we establish explicit dimension-free convergence rates for this scheme using Ricci curvature and coupling ideas. We demonstrate experimentally the excellent performance of these MCMC schemes on various applications. (A toy numerical sketch of this splitting strategy is given after this list.)

            @article{Vono_Paulin_Doucet_2019,
            author = {Vono, Maxime and Paulin, Daniel and Doucet, Arnaud},
            year = {2019},
            title = {Efficient {MCMC} sampling with dimension-free convergence rate using {ADMM}-type splitting},
            journal = {arXiv preprint arXiv:1905.11937}
            }
          
  2. Asymptotically exact data augmentation: models, properties and algorithms
    M. Vono, N. Dobigeon, and P. Chainais

    Data augmentation, by the introduction of auxiliary variables, has become a ubiquitous technique to improve mixing and convergence properties, simplify the implementation or reduce the computational time of inference methods such as Markov chain Monte Carlo. Nonetheless, introducing appropriate auxiliary variables while preserving the initial target probability distribution cannot be done in a systematic way and highly depends on the problem at hand. To deal with such issues, this paper draws a unified framework, namely asymptotically exact data augmentation (AXDA), which encompasses several well-established as well as more recent approximate augmented models. Benefiting from this much more general perspective, it delivers additional qualitative and quantitative insights into these schemes. In particular, general properties of AXDA are stated, along with non-asymptotic theoretical results on the approximation that is made. Close connections to existing Bayesian methods (e.g. mixture modeling, robust Bayesian models and approximate Bayesian computation) are also drawn. All the results are illustrated with examples and applied to standard statistical learning problems.

            @article{Vono_AXDA_2019,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            year = {2019},
            title = {Asymptotically exact data augmentation: models, properties and algorithms},
            journal = {submitted},
            url = {https://arxiv.org/abs/1902.05754}
            }
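
For readers unfamiliar with the splitting strategy used in the two preprints above, here is a minimal numerical sketch (not code from the papers). It targets pi(x) proportional to exp(-f(x) - g(x)) through an AXDA-type joint density proportional to exp(-f(x) - g(z) - ||x - z||^2 / (2 rho^2)) and alternates the two conditional sampling steps of a split Gibbs sweep. The fully Gaussian choices of f and g, and all numerical values, are assumptions made so that both conditionals are exact Gaussians.

            import numpy as np

            rng = np.random.default_rng(0)

            # Toy Gaussian setting (an illustrative assumption, not taken from the papers):
            #   f(x) = ||y - A x||^2 / (2 sigma^2)   (likelihood term, kept with x)
            #   g(z) = ||z||^2 / (2 alpha^2)         (prior term, moved onto the splitting variable z)
            # AXDA couples x and z through exp(-||x - z||^2 / (2 rho^2)); as rho -> 0,
            # the marginal of x under this joint approaches the original posterior.
            d, n = 10, 25
            A = rng.normal(size=(n, d))
            x_true = rng.normal(size=d)
            sigma, alpha, rho = 1.0, 2.0, 0.1
            y = A @ x_true + sigma * rng.normal(size=n)

            def sample_x_given_z(z):
                # x | z is Gaussian with precision A^T A / sigma^2 + I / rho^2.
                Q = A.T @ A / sigma**2 + np.eye(d) / rho**2
                mean = np.linalg.solve(Q, A.T @ y / sigma**2 + z / rho**2)
                return mean + np.linalg.cholesky(np.linalg.inv(Q)) @ rng.normal(size=d)

            def sample_z_given_x(x):
                # z | x is Gaussian with scalar precision 1 / alpha^2 + 1 / rho^2.
                prec = 1.0 / alpha**2 + 1.0 / rho**2
                mean = (x / rho**2) / prec
                return mean + rng.normal(size=d) / np.sqrt(prec)

            # Split Gibbs sweep: alternate the two conditional sampling steps.
            x, z = np.zeros(d), np.zeros(d)
            samples = []
            for k in range(5000):
                x = sample_x_given_z(z)
                z = sample_z_given_x(x)
                if k >= 1000:  # discard burn-in
                    samples.append(x.copy())
            samples = np.asarray(samples)
            print("posterior mean estimate (first 3 coordinates):", samples.mean(axis=0)[:3])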
          

Journal papers

  1. Split-and-augmented Gibbs sampler - Application to large-scale inference problems
    M. Vono, N. Dobigeon, and P. Chainais
    IEEE Transactions on Signal Processing, vol. 67, no. 6, pp. 1648-1661, March 2019

    This paper derives two new optimization-driven Monte Carlo algorithms inspired by variable splitting and data augmentation. In particular, the formulation of one of the proposed approaches is closely related to the main steps of the alternating direction method of multipliers (ADMM). The proposed framework makes it possible to derive faster and more efficient sampling schemes than current state-of-the-art methods, and can embed the latter. By efficiently sampling the parameter to be inferred as well as the hyperparameters of the problem, the generated samples can be used to approximate Bayesian estimators. Additionally, the proposed approach provides confidence intervals at a low cost, contrary to optimization methods. Simulations on two often-studied signal processing problems illustrate the performance of the two proposed samplers. All results are compared to those obtained by recent state-of-the-art optimization and MCMC algorithms used to solve these problems. (A short snippet illustrating how the generated samples yield point estimates and credibility intervals is given after this entry.)

            @article{Vono_TSP_2019,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            year = {2019},
            title = {Split-and-augmented {G}ibbs sampler - {A}pplication to large-scale inference problems},
            journal = {IEEE Transactions on Signal Processing},
            volume = {67},
            number = {6},
            pages = {1648--1661}
            }
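
As a complement to the abstract above, here is a small illustrative snippet (not from the paper) showing how a matrix of MCMC draws, such as those produced by a split-and-augmented Gibbs sampler after burn-in, can be turned into a Bayesian point estimate and componentwise 95% credibility intervals. The samples array below is a hypothetical placeholder.

            import numpy as np

            # samples: hypothetical (n_samples, dim) array of MCMC draws after burn-in,
            # e.g. the output of the split Gibbs sweep sketched in the Preprints section.
            rng = np.random.default_rng(0)
            samples = rng.normal(size=(10_000, 5))  # placeholder draws for illustration only

            # Minimum mean square error (MMSE) estimate: approximation of the posterior mean.
            x_mmse = samples.mean(axis=0)

            # Componentwise 95% credibility intervals from empirical quantiles.
            lower, upper = np.quantile(samples, [0.025, 0.975], axis=0)

            for i, (m, lo, hi) in enumerate(zip(x_mmse, lower, upper)):
                print(f"x[{i}]: MMSE = {m:+.3f}, 95% credibility interval = [{lo:+.3f}, {hi:+.3f}]")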
          

International conference papers

  1. Bayesian image restoration under Poisson noise and log-concave prior
    M. Vono, N. Dobigeon, and P. Chainais
    IEEE Int. Conf. Acoust., Speech, and Signal Processing (ICASSP), Brighton, U.K., 2019

    In recent years, much research has been devoted to the restoration of Poissonian images using optimization-based methods. On the other hand, the derivation of efficient and general fully Bayesian approaches is still an active area of research, especially when standard regularization functions are used, e.g. the total variation (TV) norm. This paper proposes to use the recent split-and-augmented Gibbs sampler (SPA) to sample efficiently from an approximation of the initial target distribution when log-concave prior distributions are used. SPA embeds proximal Markov chain Monte Carlo (MCMC) algorithms to sample from possibly non-smooth log-concave full conditionals. The benefit of the proposed approach is illustrated in several experiments involving different regularizers and intensity levels, with both analysis and synthesis approaches.

            @inproceedings{Vono_IEEE_ICASSP_2019b,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            title = {Bayesian image restoration under {P}oisson noise and log-concave prior},
            booktitle = {Proc. IEEE Int. Conf. Acoust., Speech, and Signal Processing (ICASSP)},
            address = {Brighton, U.K.},
            month = {May},
            year = {2019}
            }
          
  2. Efficient sampling through variable splitting-inspired Bayesian hierarchical models
    M. Vono, N. Dobigeon, and P. Chainais
    IEEE Int. Conf. Acoust., Speech, and Signal Processing (ICASSP), Brighton, U.K., 2019

    Markov chain Monte Carlo (MCMC) methods are an important class of computational techniques for solving Bayesian inference problems. Much recent research has been dedicated to scaling these algorithms to high-dimensional settings by relying on powerful optimization tools such as gradient information or proximity operators. In a similar vein, this paper proposes a new Bayesian hierarchical model to solve large-scale inference problems, taking inspiration from variable splitting methods. Similarly to the latter, the derived Gibbs sampler allows the initial sampling task to be divided into simpler ones. As a result, the proposed Bayesian framework can lead to faster sampling schemes than state-of-the-art methods, which it can embed. The strength of the proposed methodology is illustrated on two often-studied image processing problems.

            @inproceedings{Vono_IEEE_ICASSP_2019a,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            title = {Efficient sampling through variable splitting-inspired {B}ayesian hierarchical models},
            booktitle = {Proc. IEEE Int. Conf. Acoust., Speech, and Signal Processing (ICASSP)},
            address = {Brighton, U.K.},
            month = {May},
            year = {2019}
            }
          
  3. Sparse Bayesian binary logistic regression using the split-and-augmented Gibbs sampler
    M. Vono, N. Dobigeon, and P. Chainais
    IEEE Int. Workshop Machine Learning for Signal Processing (MLSP), Aalborg, Denmark, 2018
    Finalist for the Best Student Paper Awards

    Logistic regression has been extensively used to perform classification in machine learning and signal/image processing. Bayesian formulations of this model with sparsity-inducing priors are particularly relevant when one is interested in drawing credibility intervals with few active coefficients. Along these lines, the derivation of efficient simulation-based methods remains an active research area because of the analytically challenging form of the binomial likelihood. This paper tackles the sparse Bayesian binary logistic regression problem by relying on the recent split-and-augmented Gibbs sampler (SPA). Contrary to usual data augmentation strategies, this Markov chain Monte Carlo (MCMC) algorithm scales to high dimensions and divides the initial sampling problem into simpler ones. These sampling steps are then addressed with efficient state-of-the-art methods, namely proximal MCMC algorithms that can benefit from the recent closed-form expression of the proximal operator of the logistic cost function. SPA appears to be faster than efficient proximal MCMC algorithms and presents a reasonable computational cost compared to optimization-based methods, with the advantage of producing credibility intervals. Experiments on handwritten digit classification problems illustrate the performance of the proposed approach. (A rough sketch of the kind of proximal MCMC update mentioned here is given after this list.)

            @inproceedings{Vono_MLSP18,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            title = {Sparse {B}ayesian binary logistic regression using the split-and-augmented {G}ibbs sampler},
            booktitle = {Proc. IEEE Int. Workshop Machine Learning for Signal Processing (MLSP)},
            address = {Aalborg, Denmark},
            year = {2018}
            }
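
The proximal MCMC algorithms mentioned in the abstracts above are typically built on Moreau-Yosida-regularized Langevin dynamics (MYULA-type updates). Below is a minimal, illustrative sketch of one such update, not code from these papers: it targets pi(x) proportional to exp(-f(x) - g(x)) with f smooth and g convex but non-smooth. The Gaussian likelihood f, the l1 term g and all step-size values are assumptions made for the example.

            import numpy as np

            rng = np.random.default_rng(1)

            # Toy target: pi(x) proportional to exp(-f(x) - g(x)) with
            #   f(x) = ||y - A x||^2 / (2 sigma^2)   (smooth, gradient available)
            #   g(x) = tau * ||x||_1                 (non-smooth, proximal operator available)
            # These choices are illustrative assumptions, not taken from the papers.
            d, n = 20, 30
            A = rng.normal(size=(n, d))
            y = rng.normal(size=n)
            sigma, tau = 1.0, 0.5

            def grad_f(x):
                return A.T @ (A @ x - y) / sigma**2

            def prox_g(x, lam):
                # Proximal operator of lam * tau * ||.||_1: soft-thresholding.
                return np.sign(x) * np.maximum(np.abs(x) - lam * tau, 0.0)

            def myula_step(x, gamma, lam):
                # One Moreau-Yosida-regularized unadjusted Langevin (MYULA-type) update:
                # the non-smooth g is replaced by its Moreau envelope, whose gradient is
                # (x - prox_g(x, lam)) / lam.
                drift = grad_f(x) + (x - prox_g(x, lam)) / lam
                return x - gamma * drift + np.sqrt(2.0 * gamma) * rng.normal(size=x.shape)

            # Run the chain (step sizes chosen loosely, for illustration only).
            gamma, lam = 1e-3, 1e-2
            x = np.zeros(d)
            samples = []
            for k in range(5000):
                x = myula_step(x, gamma, lam)
                if k >= 1000:  # discard burn-in
                    samples.append(x.copy())
            samples = np.asarray(samples)
            print("approximate posterior mean (first 5 coordinates):", samples.mean(axis=0)[:5])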
          

National conference papers

  1. Modèles augmentés asymptotiquement exacts
    M. Vono, N. Dobigeon, and P. Chainais
    GRETSI, Lille, France, 2019

    The introduction of auxiliary variables into a statistical model is commonly used to simplify an inference task or to increase its efficiency. However, introducing these variables in such a way that the initial probability distribution is preserved is often a subtle art. This paper presents a unifying statistical framework that removes these obstacles by relaxing the exact-augmentation assumption. This framework, called asymptotically exact data augmentation (AXDA), encompasses some mixture models, robust Bayesian models, and models built from variable splitting. To illustrate the interest of such an approach, a Gibbs sampler based on an AXDA model is presented.

            @inproceedings{Vono_GRETSI_2019a,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            title = {Modèles augmentés asymptotiquement exacts},
            booktitle = {Proc. GRETSI},
            address = {Lille, France},
            month = {August},
            year = {2019}
            }
          
  2. Un modèle augmenté asymptotiquement exact pour la restauration bayésienne d’images dégradées par un bruit de Poisson
    M. Vono, N. Dobigeon, and P. Chainais
    GRETSI, Lille, France, 2019

    Many works have addressed the restoration of images corrupted by Poisson noise. Most of the proposed approaches rely on optimization or variational approximation algorithms. The latter are fast and efficient but do not allow a precise estimation of credibility intervals under the target posterior distribution. This paper presents a Markov chain Monte Carlo (MCMC) method to restore such images while providing a controlled measure of the uncertainty associated with the estimation. The proposed approach relies on an asymptotically exact augmented model and involves proximal MCMC algorithms to efficiently sample the distributions of interest.

            @inproceedings{Vono_GRETSI_2019b,
            author = {Vono, Maxime and Dobigeon, Nicolas and Chainais, Pierre},
            title = {Un modèle augmenté asymptotiquement exact pour la restauration bayésienne d’images dégradées par un bruit de Poisson},
            booktitle = {Proc. GRETSI},
            address = {Lille, France},
            month = {August},
            year = {2019}
            }
          
Invited talks

  1. Asymptotically exact data augmentation: models, algorithms and theory
    Workshop on optimization, probability and simulation, organized by the Shenzhen Institute of Artificial Intelligence and Robotics for Society (AIRS), The Chinese University of Hong Kong, Shenzhen, China, December 2019
  2. On artificial intelligence dedicated to supply chain and business strategy
    Mews Digital Day, organized by the management consulting firm Mews Partners, Toulouse, France, April 2019
  3. Split-and-augmented Gibbs sampler - A divide & conquer approach to solve large-scale inference problems
    CRIStAL laboratory seminar organized by the SigMA team, Lille, France, May 2018

Consulting


Starting in September 2018, I will work as a data science/analytics consultant for the Marketing & Strategy division of the international supermarket chain Intermarché Alimentaire International (ITM AI). These consultancy missions will be carried out in parallel with my Ph.D. studies and will be an opportunity to apply my previous experience, knowledge and work to concrete and important issues for retailers, namely sales forecasting, pricing strategy and promotional events.

Some of the above issues were tackled in my previous internships and in my Master's thesis (in applied mathematics), where I was particularly interested in optimal pricing policies for clearance and/or promotional events. To this end, I carried out my Master's thesis in partnership with the pricing department of Leroy Merlin France (LMF), a French home-improvement and gardening retailer, working on pricing policy optimization for clearance events. The optimal pricing strategy I derived during this work met the requirements and constraints of LMF's pricing department (limited number of price changes, modeling of the uncertainty in the buying process, etc.), was tested on their past transactional data and was later proposed to some French brick-and-mortar stores.

CV [pdf version] (updated on 17.11.19)


2017 - 2020 (expected) Ph.D. - Statistics
University of Toulouse, France

Optimization-driven Monte Carlo algorithms
In spring 2019, I was a visiting research student at the Department of Statistics of the University of Oxford under the supervision of Arnaud Doucet.

2016 - 2017 M.Sc. - Applied Mathematics (Probability & Statistics)
University of Lille, France

Obtained with honors

2013 - 2017 M.Sc. - Engineering (Data Science)
Ecole Centrale de Lille, France

Including a gap year
Head of the class (rank: 1)

Find me

On the Web

Mailing address

INP - ENSEEIHT Toulouse
2, rue Charles Camichel
B.P. 7122
31071 Toulouse Cedex 7
France



© 2017. All rights reserved.
