Estimation of Functionals of High-Dimensional Parameters: Bias Reduction and Concentration, Vladimir Koltchinskii (Georgia Institute of Technology)

May 6 @ 1:00 pm - May 23 @ 4:15 pm | Organizer: Alexandre Tsybakov

SCHEDULE

Monday 6th May 2024 and Monday 13th May 2024, from 13:00 to 16:15, Room 2033
Thursday 16th May 2024 and Thursday 23rd May 2024, from 13:00 to 16:15, Room 2033

Aims and objectives

The course will focus on a circle of problems related to the estimation of real-valued functionals of parameters of high-dimensional statistical models. In such problems, one is interested in one-dimensional features of a high-dimensional parameter, often represented by nonlinear functionals of a certain degree of smoothness defined on the parameter space. Such functionals can often be estimated at faster convergence rates than the whole parameter (sometimes even at parametric rates). Examples include the estimation of linear functionals of principal components (which are nonlinear functionals of the unknown covariance) in high-dimensional PCA. The goal is to discuss several mathematical methods for constructing estimators of functionals of high-dimensional parameters with optimal error rates over classes of functionals of given Hölder smoothness.
Moreover, when the degree of smoothness of the functional is above a certain threshold, the estimators in question attain the parametric √n rate and are asymptotically efficient, whereas the error rates become slower than √n when the degree of smoothness falls below this threshold, as sketched schematically below.
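As a schematic illustration of this threshold phenomenon (this display is not part of the announcement; it assumes the setting of i.i.d. observations from a d-dimensional Gaussian model with d ≍ n^α, α ∈ (0,1), and a functional f of Hölder smoothness s, in which results of this type are known):

\[
d \asymp n^{\alpha},\quad \alpha \in (0,1):\qquad
\begin{cases}
\; s > \dfrac{1}{1-\alpha}: & f(\theta)\ \text{can be estimated at the parametric rate } n^{-1/2}\ \text{(asymptotically efficiently)},\\[4pt]
\; s < \dfrac{1}{1-\alpha}: & \text{no}\ \sqrt{n}\text{-consistent estimator of } f(\theta)\ \text{exists}.
\end{cases}
\]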
The following topics will be covered (at least to some extent):
• preliminaries in high-dimensional probability and analysis (concentration inequalities, comparison inequalities, Hölder smoothness of operator functions, etc.);
• non-asymptotic bounds and concentration inequalities for sample covariance in high-dimensional and dimension-free frameworks;
• some approaches to concentration inequalities for smooth functionals of statistical estimators;
• higher-order bias reduction methods in functional estimation:
– methods based on Taylor expansion and estimation of polynomials with reduced bias;
– iterative bias reduction and bootstrap chains (a schematic version of this construction is sketched after this list);
– linear aggregation of plug-in estimators with different sample sizes and jackknife estimators (a minimal jackknife example follows the sketch after this list);
• minimax lower bounds in functional estimation (applications of the van Trees inequality, Nemirovski’s construction of least favorable functionals, etc.);
• examples:
– high-dimensional and infinite-dimensional Gaussian models: functionals of the mean and of the covariance;
– log-concave models, in particular, log-concave location families;
– high-dimensional exponential families;
– nonparametric models: functionals of an unknown density;
– linear functionals of spectral projections of matrix parameters.
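
To make the item on iterative bias reduction and bootstrap chains more concrete, the following is a schematic version of the construction used in this line of work, written in generic notation as an assumption about the intended content rather than quoted from the course. Given an estimator θ̂ = θ̂(X) of θ, define

\[
(\mathcal{T} f)(\theta) := \mathbb{E}_{\theta}\, f\bigl(\hat\theta(X)\bigr),
\qquad
\mathcal{B} := \mathcal{T} - \mathcal{I},
\qquad
f_k := \sum_{j=0}^{k} (-1)^{j}\, \mathcal{B}^{j} f .
\]

A telescoping argument then shows that the corrected estimator f_k(θ̂) has bias

\[
\mathbb{E}_{\theta}\, f_k(\hat\theta) - f(\theta) \;=\; (-1)^{k}\, (\mathcal{B}^{k+1} f)(\theta),
\]

which is of higher order whenever the bias operator \mathcal{B} is suitably contractive on Hölder classes; the iterates \mathcal{B}^{j} f(\theta) are expectations along the "bootstrap chain" θ̂, θ̂*, θ̂**, … obtained by repeatedly resampling from the fitted model.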
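
As a complement to the item on jackknife estimators, here is a minimal numerical sketch of the simplest bias-reduction device of that kind, the classical leave-one-out (Quenouille) jackknife applied to a plug-in estimator. The functional ‖θ‖², the Gaussian model, the sample sizes, and all names in the code are illustrative choices made for this sketch, not constructions taken from the course.

import numpy as np

rng = np.random.default_rng(0)

def plug_in(sample, f):
    # naive plug-in estimator: apply the functional f to the sample mean
    return f(sample.mean(axis=0))

def jackknife(sample, f):
    # Quenouille's jackknife: n * T_n - (n - 1) * average of the leave-one-out
    # plug-in estimates; this removes the leading O(1/n) bias term of T_n
    n = sample.shape[0]
    full = plug_in(sample, f)
    loo = np.array([plug_in(np.delete(sample, i, axis=0), f) for i in range(n)])
    return n * full - (n - 1) * loo.mean()

# toy example: estimate f(theta) = ||theta||^2 from X_1, ..., X_n ~ N(theta, I_d);
# the plug-in f(X_bar) is biased upward by exactly d/n, which the jackknife removes
d, n = 500, 500
theta = np.ones(d) / np.sqrt(d)            # so that ||theta||^2 = 1
f = lambda m: float(m @ m)

X = theta + rng.standard_normal((n, d))
print("plug-in estimate :", plug_in(X, f))    # roughly 2.0 (bias d/n = 1)
print("jackknife        :", jackknife(X, f))  # roughly 1.0
print("true value       :", float(theta @ theta))

The higher-order corrections discussed in the course (Taylor-expansion based estimators, iterated bootstrap-chain corrections, linear aggregation over several sample sizes) can be viewed as refinements of the same idea, designed to remove bias terms beyond the leading 1/n term.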