
Information Theory & Coding (Machine Learning & Statistics)
Spring 2024

Lecture 01: information theory history and applications, information measures, entropy
Lecture 02: entropy, convexity, submodularity, divergence
Lecture 03: differential entropy, conditional divergence, mutual information
Lecture 04: mutual information, conditional mutual information
Lecture 05: variational characterization of divergence, sufficient statistics
Lecture 06: variational characterization of divergence, sufficient statistics
Lecture 07: feature selection via information gain, structure learning, density estimation
Lecture 08: information projection, information bottleneck
Lecture 09: source coding, Kraft and McMillan theorems, Huffman codes, prefix codes
Lecture 10: minimum description length (MDL) principle, rate-distortion theory
Lecture 11: empirical risk minimization, histogram classifiers, decision trees
Lecture 12: histogram regression, universal prediction, unbounded loss functions
Lecture 13: basics of statistical decision theory
Lecture 14: risk functions
Lecture 15: tensor product of experiments, sample complexity
Lecture 16: sample complexity, f-divergence, hypothesis testing, connections between f-divergences
Lecture 17: connections between f-divergences, variational form of f-divergence
Lecture 18: f-divergence, parameter estimation, Hammersley-Chapman-Robbins (HCR) bound, Cramér-Rao (CR) lower bound, Fisher information
Lecture 19: Fisher information, multivariate HCR bound
Lecture 20: Bayesian CR lower bound, information bound, local estimators, biased estimators
Lecture 21: maximum likelihood estimator, high-dimensional unstructured estimation, bowl-shaped loss
Lecture 22: two-point quantization of the estimation problem (Le Cam's method)
Lecture 23: mutual information method, Fano's method, density estimation