
Confidence regions in high-dimensional and nonparametric statistical models
Richard Nickl, Cambridge University
In high-dimensional and nonparametric statistical models, optimal (adaptive) estimators typically require a model selection, dimension reduction or regularisation step, and as a consequence using them for inference is a non-obvious task. In particular, 'uncertainty quantification', that is, the construction of adaptive 'honest' confidence regions that are valid uniformly in the parameter space, may not be straightforward, or may even be impossible. I will explain the main ideas of a decision-theoretic framework (that has emerged in the last 10 years or so) giving general information-theoretic conditions which allow one to check whether honest confidence sets exist in a given statistical model and, when the answer is negative, which 'signal strength' conditions are required to make adaptive inference possible. These conditions involve the minimax solution of certain composite high-dimensional testing problems, somewhat related to the minimax 'signal detection' problem. I will show how the general theory can be applied to several examples, such as sparse or nonparametric regression, density estimation, low rank matrix recovery and matrix completion. I will also describe some concrete uncertainty quantification procedures, Bayesian and non-Bayesian, that can be used in such models.
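For concreteness, the 'honest' coverage requirement can be formalised as follows (a standard textbook definition, not a verbatim statement from the talk): a confidence set C_n = C_n(X_1, ..., X_n; α), constructed from the data at nominal level 1 − α, is honest over a parameter class F if its coverage holds uniformly over F,

```latex
\liminf_{n \to \infty} \; \inf_{f \in \mathcal{F}} \; P_f\bigl( f \in C_n \bigr) \;\ge\; 1 - \alpha .
```

Adaptivity additionally asks that the diameter of C_n shrink at the minimax rate of the (unknown) sub-model containing the true f; the negative results alluded to above say that honesty and adaptivity can be jointly unattainable unless the parameter space is restricted by a signal-strength condition.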
Slides:
Nickl_lectures.pdf

