Syllabus: Introduction to Markov chains (if necessary). Shannon's entropy, Gibbs' inequality, typical sequences of random vectors, Shannon's theorem. Capacity-cost function and channel coding theorem. Rate distortion function and source coding theorem. Stein's lemma and properties of information measures. Uniquely decipherable codes, codes on trees, Kraft's inequality, Kraft's code, Huffman's code, Shannon-Fano-Elias code. Parsing codes and trees, Tunstall's code. Universal source coding, empirical distributions, Kullback-Leibler divergence. Parsing entropy, Lempel-Ziv algorithm, entropy equivalence. Mutual information and capacity of noisy channels.
ADDITIONAL TOPICS FROM: Stationary coding of finite alphabets, Ergodic theorem for binary alphabets and examples. Frequencies of finite blocks and Entropy theorem.
Reference Texts:
(a) T. M. Cover and J. A. Thomas. Elements of Information Theory.
(b) P. Brémaud. Discrete Probability Models and Methods.
(c) D. J. C. MacKay. Information Theory, Inference, and Learning Algorithms.
(d) R. J. McEliece. The Theory of Information and Coding.
(e) P. C. Shields. The Ergodic Theory of Discrete Sample Paths.
https://www.isibang.ac.in/~adean/infsys/database/Bmath/IT.html

- Teacher: Jaikumar Radhakrishnan