Supplemental reading material and lecture notes will be posted along with each week’s lecture recording. Note that the first two weeks include lectures during the work sessions as well, and so cover more material. The last two weeks will combine pedagogical material with selected applications.
- Week 1: Overview + Intro to information theory, probability and statistical inference — Information and disorder, entropy in statistical mechanics and its relation to entropy in information theory, the Kullback-Leibler divergence. Parameter estimation and model selection with finite sample size, statistical moments, covariance, conditional probabilities and conditional entropy, likelihood functions, Fisher information matrices. Bayes' theorem, applications of Bayesian reasoning, Bayesian model selection.
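As a taste of the Week 1 material, the Shannon entropy and the Kullback-Leibler divergence for discrete distributions can be computed in a few lines. This is a minimal sketch (the distributions `p` and `q` are illustrative, not from the course notes); it shows that the KL divergence is non-negative and asymmetric in its arguments.

```python
import math

def entropy(p):
    """Shannon entropy in nats: -sum p_i log p_i (terms with p_i = 0 contribute 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]       # example distribution
q = [1/3, 1/3, 1/3]         # uniform reference distribution

print(entropy(p))           # 1.5 * ln 2 ≈ 1.0397 nats
print(kl(p, q))             # > 0, and zero only if p == q
print(kl(q, p))             # differs from kl(p, q): KL is not symmetric
```

The asymmetry of `kl` is exactly why the KL divergence is not a metric distance, a point the information metric in Week 2 addresses.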
- Week 2: Differential geometry + information geometry I — Differentiable manifolds, coordinate charts, vectors and derivations, tangent and cotangent spaces. Tensors as multilinear maps, the metric tensor, geodesics, curvature, statistical manifolds, Fisher information and the information metric, relation to the KL divergence, biased and unbiased estimators. The Cramér-Rao bound, sufficient statistics, uniqueness of the information metric, exponential families, information geometry and statistical inference.
- Week 3: Information geometry II — Dual connections, dually flat manifolds, submanifolds. Applications: thermodynamic distance, phase transitions and thermodynamic hysteresis.
- Week 4: Information geometry III — Embedded exponential families, parametric and non-parametric statistical inference. Applications: information geometry of quantum systems.