## Information Theory & High-dimensional Statistics

## Fall 2018

__Lecture 1__

information theory history and applications, information measures, entropy

__Lecture 2__

entropy, convexity, submodularity, divergence
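As a quick illustration of the entropy measure covered in the first two lectures, here is a minimal Python sketch (the function name is illustrative, not from the course):

```python
import math

def entropy(p, base=2):
    """Shannon entropy H(X) = -sum_i p_i log p_i of a discrete pmf.

    Terms with p_i = 0 contribute 0, by the convention 0 log 0 = 0.
    """
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin has exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # → 1.0
```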

__Lecture 3__

differential entropy, conditional divergence, mutual information

__Lecture 4__

mutual information, conditional mutual information, geometric interpretation of mutual information
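The mutual information discussed here can be computed directly from a joint pmf as the divergence between the joint and the product of its marginals; a small Python sketch (function names are illustrative):

```python
import math

def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X x P_Y) in bits, for a joint pmf
    given as a 2-D list of probabilities."""
    px = [sum(row) for row in joint]           # marginal of X
    py = [sum(col) for col in zip(*joint)]     # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Independent X and Y carry zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```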

__Lecture 5__

variational characterization of divergence, sufficient statistics

__Lecture 6__

statistical decision theory: basics

__Lecture 7__

risk functions

__Lecture 8__

tensor product of experiments, sample complexity

__Lecture 9__

sample complexity, f-divergence, hypothesis testing, connection between f-divergences

__Lecture 10__

connection between f-divergences, variational form of f-divergence

__Lecture 11__

f-divergence, parameter estimation, Hammersley–Chapman–Robbins (HCR) bound, Cramér–Rao (CR) lower bound, Fisher information
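For reference, the two central quantities of these lectures, in standard form (for a scalar parameter and n i.i.d. samples):

```latex
% Fisher information of a family {p_\theta} and the Cramér–Rao
% lower bound for any unbiased estimator \hat\theta:
\[
  I(\theta) \;=\; \mathbb{E}_\theta\!\left[\left(
      \frac{\partial}{\partial\theta}\log p_\theta(X)\right)^{\!2}\right],
  \qquad
  \operatorname{Var}_\theta\!\big(\hat\theta\big) \;\ge\; \frac{1}{n\,I(\theta)}.
\]
```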

__Lecture 12__

Fisher information, multivariate HCR bound

__Lecture 13__

Bayesian CR lower bound, information bound, local estimators, biased estimators

__Lecture 14__

maximum likelihood estimator, high-dimensional unstructured estimation, bowl-shaped loss

__Lecture 15__

two-point quantization of the estimation problem (Le Cam's method)
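The two-point idea in its standard form: pick parameters separated in the loss but statistically close, and reduce estimation to binary testing. For a loss with ℓ(θ₀, θ₁) ≥ 2δ:

```latex
% Le Cam's two-point lower bound on the minimax risk:
\[
  \inf_{\hat\theta}\,\sup_{\theta \in \{\theta_0,\theta_1\}}
  \mathbb{E}_\theta\,\ell\big(\hat\theta,\theta\big)
  \;\ge\; \frac{\delta}{2}\Big(1 - \mathrm{TV}\big(P_{\theta_0}, P_{\theta_1}\big)\Big).
\]
```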

__Lecture 16__

two-point per dimension (coordinate) quantization of the estimation problem (Assouad's method)

__Lecture 17__

information-theoretic method for analyzing risk; model capacity, geometric interpretation

__Lecture 18__

Shannon's method, Fano's method
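The workhorse inequality behind Fano's method, stated in its usual form: for M hypotheses θ₁, …, θ_M with θ uniform and any estimator θ̂ based on the observation X,

```latex
% Fano's inequality: the error probability of any M-ary test is
% lower-bounded via the mutual information I(\theta; X):
\[
  \Pr\big[\hat\theta \ne \theta\big]
  \;\ge\; 1 - \frac{I(\theta; X) + \log 2}{\log M}.
\]
```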

__Lecture 19__

structured high-dimensional estimation, denoising a sparse vector (lower bound)

__Lecture 20__

denoising a sparse vector (upper bound); thresholding schemes for sparse recovery
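One of the thresholding schemes referenced here, soft thresholding, in a minimal Python sketch (the function name and the choice of threshold are illustrative; a common choice is τ on the order of the noise level times √(2 log n)):

```python
def soft_threshold(y, tau):
    """Soft-threshold each coordinate: shrink |y_i| toward 0 by tau,
    zeroing anything with |y_i| <= tau. A standard denoiser for a
    sparse vector observed in Gaussian noise."""
    out = []
    for yi in y:
        mag = abs(yi) - tau
        out.append(0.0 if mag <= 0 else mag * (1.0 if yi > 0 else -1.0))
    return out

print(soft_threshold([3.0, -0.5, 1.5, -2.0], 1.0))  # → [2.0, 0.0, 0.5, -1.0]
```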

__Lecture 21__

linear regression and sparse recovery

__Lecture 22__

functional estimation (lower bounds)

__Lecture 23__

functional estimation (upper bounds)