Information Theory & High-dimensional Statistics
Spring 2021

Lecture 1

history and applications of information theory, information measures, entropy

Lecture 2

entropy, convexity, submodularity, divergence
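
As a quick numerical companion to the entropy and divergence notions covered here, the sketch below computes Shannon entropy and KL divergence for finite distributions (a minimal NumPy illustration; the function names are mine, not from the lecture notes):

    import numpy as np

    def entropy(p):
        # Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits (0 log 0 = 0).
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def kl_divergence(p, q):
        # D(p || q) = sum_i p_i log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    print(entropy([0.5, 0.5]))                     # 1.0 bit: a fair coin
    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # positive; zero iff p == q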

Lecture 3

differential entropy, conditional divergence, mutual information

Lecture 4

mutual information, conditional mutual information, geometric interpretation of mutual information
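
A useful identity behind the geometric picture is I(X;Y) = D(P_XY || P_X x P_Y), which the sketch below evaluates for a joint pmf given as a matrix (illustrative NumPy code, not taken from the notes):

    import numpy as np

    def mutual_information(joint):
        # I(X;Y) = D(P_XY || P_X x P_Y), in bits, for a joint pmf matrix.
        joint = np.asarray(joint, dtype=float)
        px = joint.sum(axis=1, keepdims=True)   # marginal of X (column vector)
        py = joint.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
        mask = joint > 0
        return np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask]))

    print(mutual_information(np.outer([0.3, 0.7], [0.4, 0.6])))  # ~0: independence
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))          # 1.0 bit: X = Y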

Lecture 5

variational characterization of divergence, sufficient statistics

Lecture 6

statistical decision theory: basics

Lecture 7

risk functions

Lecture 8

tensor product of experiments, sample complexity

Lecture 9

sample complexity, f-divergence, hypothesis testing, connections between f-divergences

Lecture 10

connections between f-divergences, variational form of f-divergence
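
For reference, the variational form referred to here is the standard Legendre-duality one (not transcribed from the notes): for convex f with f(1) = 0 and convex conjugate f^*,

    D_f(P \| Q) \;=\; \mathbb{E}_Q\!\left[ f\!\left( \frac{dP}{dQ} \right) \right]
    \;=\; \sup_{g}\; \mathbb{E}_P[g(X)] - \mathbb{E}_Q[f^*(g(X))],

with the supremum over suitably bounded measurable functions g; for example, f(x) = x log x gives KL divergence.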

Lecture 11

f-divergence, parameter estimation, Hammersley–Chapman–Robbins (HCR) bound, Cramér–Rao (CR) lower bound, Fisher information
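
As a reminder (standard statements, not quoted from the lecture), for an unbiased estimator \hat\theta of a scalar parameter \theta:

    \mathrm{Var}_\theta(\hat\theta) \;\ge\; \frac{1}{I(\theta)},
    \qquad
    I(\theta) \;=\; \mathbb{E}_\theta\!\left[ \left( \frac{\partial}{\partial \theta} \log p_\theta(X) \right)^{\!2} \right]
    \quad \text{(Cram\'er--Rao)},

    \mathrm{Var}_\theta(\hat\theta) \;\ge\; \sup_{\theta' \ne \theta} \frac{(\theta' - \theta)^2}{\chi^2(P_{\theta'} \,\|\, P_\theta)}
    \quad \text{(HCR)};

the CR bound arises from the HCR bound in the limit \theta' \to \theta.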

Lecture 12

Fisher information, multivariate HCR bound

Lecture 13

Bayesian CR lower bound, information bound, local estimators, biased estimators

Lecture 14

maximum likelihood estimator, high-dimensional unstructured estimation, bowl-shaped loss

Lecture 15

two-point quantization of the estimation problem (Le Cam's method)
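
In its standard form (stated here as a reminder, not quoted from the notes), the two-point bound reads: for any two parameters \theta_0, \theta_1 with distributions P_0, P_1 and a metric loss d,

    \inf_{\hat\theta}\ \max_{i \in \{0,1\}}\ \mathbb{E}_{P_i}\bigl[ d(\hat\theta, \theta_i) \bigr]
    \;\ge\; \frac{d(\theta_0, \theta_1)}{2}\, \bigl( 1 - \mathrm{TV}(P_0, P_1) \bigr),

so a good lower bound needs two points that are far apart in the loss yet nearly indistinguishable in total variation.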

Lecture 16

two-point quantization in each coordinate of the estimation problem (Assouad's method)

Lecture 17

information-theoretic methods for analyzing risk; model capacity, geometric interpretation

Lecture 18

Shannon's method, Fano's method
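
Fano's method rests on Fano's inequality: in its standard form, if X is uniform on M candidate parameters and \hat X is any estimator based on an observation Y, the error probability satisfies

    \mathbb{P}(\hat X \ne X) \;\ge\; 1 - \frac{I(X;Y) + \log 2}{\log M},

so the risk stays bounded away from zero unless the mutual information grows with log M.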

Lecture 19

structured high-dimensional estimation, denoising a sparse vector (lower bound)

Lecture 20

denoising a sparse vector (upper bound); thresholding schemes for sparse recovery
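
A minimal sketch of the two classical thresholding rules for denoising y = theta + sigma * z with theta sparse (the dimensions, sparsity level, and threshold choice below are illustrative, not the course's exact setup):

    import numpy as np

    def hard_threshold(y, t):
        # Keep coordinates with |y_i| > t, zero out the rest.
        return np.where(np.abs(y) > t, y, 0.0)

    def soft_threshold(y, t):
        # Shrink each coordinate toward zero: sign(y_i) * max(|y_i| - t, 0).
        return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

    rng = np.random.default_rng(0)
    n, k, sigma = 1000, 10, 1.0
    theta = np.zeros(n); theta[:k] = 5.0            # k-sparse mean vector
    y = theta + sigma * rng.standard_normal(n)      # Gaussian observations
    t = sigma * np.sqrt(2 * np.log(n))              # "universal" threshold
    print(np.sum((hard_threshold(y, t) - theta) ** 2))  # far below n * sigma^2

The universal threshold sigma * sqrt(2 log n) is chosen so that pure-noise coordinates fall below it with high probability.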

Lecture 21

linear regression and sparse recovery
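
As one concrete instance of the regression route to sparse recovery, here is a short scikit-learn sketch of l1-regularized least squares (the Lasso); the dimensions and regularization level are illustrative, and the course's specific estimator and analysis may differ:

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p, k = 100, 400, 5
    X = rng.standard_normal((n, p))                 # design matrix, p >> n
    beta = np.zeros(p); beta[:k] = 3.0              # k-sparse coefficients
    y = X @ beta + 0.5 * rng.standard_normal(n)

    fit = Lasso(alpha=0.2).fit(X, y)
    print(np.nonzero(fit.coef_)[0])                 # mostly the true support {0, ..., k-1}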

Lecture 22

functional estimation (lower bounds)

Lecture 23

functional estimation (upper bounds)
