Philip THOMPSON (ENSAE-ParisTech CREST) – “Stochastic approximation with heavier tails”
October 23, 2:00 pm - 3:15 pm
The Statistical Seminar: Every Monday at 2:00 pm.
Time: 2:00 pm – 3:15 pm
Date: 23rd of October 2017
Place: Room 3001.
Abstract: We consider the solution of convex optimization and variational inequality problems via the stochastic approximation methodology, where the gradient or operator can only be accessed through an unbiased stochastic oracle. First, we show that (non-asymptotic) convergence is possible with unbounded constraints and a “multiplicative noise” model: the oracle is Lipschitz continuous with a finite pointwise variance which may not be uniformly bounded (as classically assumed). In this setting, our bounds depend on local variances at solutions, and the method uses noise reduction in an efficient manner: given a precision, it attains near-optimal sample and averaging complexities of Polyak-Ruppert’s method while achieving the order of the (faster) deterministic iteration complexity. Second, we discuss a more “robust” version where the Lipschitz constant L is unknown but near-optimal complexities, in terms of error precision, are maintained. A price to pay when L is unknown is that a large-sample regime is assumed (still respecting the complexity of the SAA estimator) and “non-martingale-like” dependencies are introduced. These dependencies are handled by means of an “iterative localization” argument based on empirical process theory and self-normalization.
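To fix ideas, the stochastic approximation scheme with Polyak-Ruppert averaging discussed above can be sketched as follows. This is a generic illustration, not the speaker's exact method: the objective, the multiplicative-noise oracle (whose variance scales with the distance to the solution, hence vanishes at the solution but is not uniformly bounded), and the step-size schedule are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch of stochastic approximation under a "multiplicative noise"
# oracle.  The quadratic objective f(x) = 0.5 * ||x - x_star||^2, the noise
# model, and the 1/sqrt(k) step sizes are illustrative choices, not the
# talk's construction.

rng = np.random.default_rng(0)
d = 5
x_star = np.ones(d)  # minimizer of the illustrative quadratic objective

def oracle(x):
    """Unbiased stochastic gradient; noise variance grows with ||x - x_star||,
    so it is finite pointwise but not uniformly bounded over the domain."""
    grad = x - x_star
    noise = 0.1 * np.linalg.norm(x - x_star) * rng.standard_normal(d)
    return grad + noise

x = np.zeros(d)
avg = np.zeros(d)
for k in range(1, 2001):
    step = 1.0 / np.sqrt(k)   # classical stochastic approximation step size
    x = x - step * oracle(x)
    avg += (x - avg) / k      # running Polyak-Ruppert average of the iterates

print(np.linalg.norm(avg - x_star))  # residual of the averaged iterate
```

Because the noise here is multiplicative, its variance shrinks as the iterates approach the solution, which is why bounds depending on local variances at solutions (rather than a global variance bound) are the natural quantity in this regime.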
Joint work with A. Iusem (IMPA), A. Jofré (CMM-Chile) and R.I. Oliveira (IMPA).
Organizers: Cristina BUTUCEA, Alexandre TSYBAKOV, Eric MOULINES, Mathieu ROSENBAUM