Information Theory & Coding (High-dimensional Statistics)
Fall 2018

Lecture 01: history and applications of information theory, information measures, entropy

Lecture 02: entropy, convexity, submodularity, divergence

Lecture 03: differential entropy, conditional divergence, mutual information

Lecture 04: mutual information, conditional mutual information, geometric interpretation of mutual information

Lecture 05: variational characterization of divergence, sufficient statistics

Lecture 06: statistical decision theory (basics)

Lecture 07: risk functions

Lecture 08: tensor product of experiments, sample complexity

Lecture 09: sample complexity, f-divergences, hypothesis testing, connections between f-divergences

Lecture 10: connections between f-divergences, variational form of f-divergences

Lecture 11: f-divergences, parameter estimation, Hammersley-Chapman-Robbins (HCR) bound, Cramér-Rao (CR) lower bound, Fisher information

Lecture 12: Fisher information, multivariate HCR bound

Lecture 13: Bayesian CR lower bound, information bound, local estimators, biased estimators

Lecture 14: maximum likelihood estimator, high-dimensional unstructured estimation, bowl-shaped losses

Lecture 15: two-point quantization of the estimation problem (Le Cam's method)

Lecture 16: per-coordinate two-point quantization of the estimation problem (Assouad's method)

Lecture 17: information-theoretic methods for analyzing risk, model capacity, geometric interpretation

Lecture 18: Shannon's method, Fano's method

Lecture 19: structured high-dimensional estimation, denoising a sparse vector (lower bound)

Lecture 20: denoising a sparse vector (upper bound), thresholding schemes for sparse recovery

Lecture 21: linear regression and sparse recovery

Lecture 22: functional estimation (lower bounds)

Lecture 23: functional estimation (upper bounds)
