ACCQ202: Information Theory
1. What is information? How do we store information? Source coding theorem, CT 5-5.5, 2.1, complement to prefix-free codes and Pacôme/Mathys complement to uniquely decodable codes (see the Kraft inequality sketch below): info, HW1, HW1sol
2. Entropy bound, Shannon codes, block coding, Huffman codes, guessing, CT 5.6-5.7 (see the Huffman sketch below): HW2, HW2sol, (old notes 1-2)
3. Entropy, mutual information, typicality, CT 2.1-2.6 (see the mutual-information and AEP sketches below): HW3, HW3sol
4. Joint typicality, information transmission, information capacity, operational capacity, CT 7-7.6 (see the Blahut-Arimoto sketch below): HW4, HW4sol
5. Channel coding theorem, proof of the direct part of the channel coding theorem, zero-error capacity, CT 7.7, 7.9, 9.1: HW5, HW5sol, Shannon's zero-error paper
6. Converse to the channel coding theorem, data-processing inequality (DPI), Fano's inequality, joint source-channel coding, compression beyond entropy (rate-distortion), CT 7.9, 9.2 (see the converse inequalities below): HW6, HW6sol
7. Gambling (beautiful minds at play), Samuelson's paper, Thorp and Shannon (and Kelly) (see the Kelly betting sketch below): old exam as HW7
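
Complements (illustrative sketches for some of the topics above):

For lecture 1, a minimal check of Kraft's inequality, which characterizes the codeword lengths achievable by binary prefix-free codes; the function name and example lengths are made up for illustration:

```python
def kraft_sum(lengths):
    """Kraft sum of a list of binary codeword lengths.

    A binary prefix-free code with lengths l_1, ..., l_m exists
    if and only if sum_i 2^(-l_i) <= 1 (Kraft's inequality).
    """
    return sum(2.0 ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0: realizable, e.g. {0, 10, 110, 111}
print(kraft_sum([1, 1, 2]))     # 1.25: no prefix-free code has these lengths
```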
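
For lecture 2, a minimal Huffman coder, written as a sketch; the example distribution is dyadic, so the expected codeword length meets the entropy exactly:

```python
import heapq

def huffman_code(probs):
    """Optimal binary prefix-free code for a dict {symbol: probability}."""
    # Heap entries: (probability, tiebreak counter, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least likely subtrees, prefixing their codewords.
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}: expected length 1.75 = H(X)
```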
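
For lecture 3, entropy and mutual information computed from a joint pmf via the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the matrix below is intended to reproduce CT's Example 2.2.1 (as I recall the numbers), which gives I(X;Y) = 3/8 bit:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Joint pmf p(x, y); rows index x, columns index y.
pxy = [[1/8,  1/16, 1/32, 1/32],
       [1/16, 1/8,  1/32, 1/32],
       [1/16, 1/16, 1/16, 1/16],
       [1/4,  0,    0,    0   ]]
px = [sum(row) for row in pxy]          # marginal of X
py = [sum(col) for col in zip(*pxy)]    # marginal of Y
print(H(px) + H(py) - H([p for row in pxy for p in row]))  # I(X;Y) = 0.375
```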
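
Also for lecture 3, a numerical look at the AEP behind typicality: for an i.i.d. source, -(1/n) log2 p(X_1, ..., X_n) concentrates around H(X), which is what makes the typical set work. The source pmf here is arbitrary:

```python
import math, random

p = {"a": 0.7, "b": 0.2, "c": 0.1}
H = -sum(q * math.log2(q) for q in p.values())

random.seed(0)
n = 100_000
xs = random.choices(list(p), weights=list(p.values()), k=n)
empirical = -sum(math.log2(p[x]) for x in xs) / n
print(H, empirical)  # both close to 1.157 bits
```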
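
For lecture 4, the information capacity max_p I(X;Y) of a discrete memoryless channel can be computed numerically by alternating maximization; this is a sketch of the Blahut-Arimoto iteration (a complement, not part of CT 7-7.6), sanity-checked on a binary symmetric channel:

```python
import math

def mutual_info(p, W):
    """I(X;Y) in bits for input pmf p and channel matrix W[x][y] = p(y|x)."""
    q = [sum(p[x] * W[x][y] for x in range(len(p))) for y in range(len(W[0]))]
    return sum(p[x] * W[x][y] * math.log2(W[x][y] / q[y])
               for x in range(len(p)) for y in range(len(W[0])) if W[x][y] > 0)

def blahut_arimoto(W, iters=200):
    """Approximate the capacity of the DMC W by the Blahut-Arimoto iteration."""
    n = len(W)
    p = [1.0 / n] * n  # start from the uniform input pmf
    for _ in range(iters):
        q = [sum(p[x] * W[x][y] for x in range(n)) for y in range(len(W[0]))]
        # Reweight each input letter by exp2 of its divergence D(W(.|x) || q).
        d = [2.0 ** sum(W[x][y] * math.log2(W[x][y] / q[y])
                        for y in range(len(W[0])) if W[x][y] > 0)
             for x in range(n)]
        z = sum(p[x] * d[x] for x in range(n))
        p = [p[x] * d[x] / z for x in range(n)]
    return mutual_info(p, W)

# BSC with crossover 0.11: capacity is 1 - h(0.11), about 0.5 bit per use.
print(blahut_arimoto([[0.89, 0.11], [0.11, 0.89]]))
```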
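
For lecture 6, the two inequalities that drive the converse, stated for reference (notation as in CT, with h the binary entropy function):

```latex
\[
  \text{DPI: } X \to Y \to Z \;\Longrightarrow\; I(X;Z) \le I(X;Y),
\]
\[
  \text{Fano: } H(X \mid \hat{X}) \le h(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr),
  \qquad P_e = \Pr(\hat{X} \neq X).
\]
```

Fano turns a vanishing error probability into vanishing conditional entropy of the message given the decoder's estimate, and the DPI caps the mutual information across the chain message-input-output-estimate by the channel's capacity; together they give R <= C for any sequence of reliable codes.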
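
For lecture 7, a sketch of Kelly betting on repeated even-odds wagers on a coin with win probability p (the numbers are illustrative): the log-optimal fraction is f* = 2p - 1 and the optimal doubling rate is 1 - h(p), the same expression as the BSC capacity.

```python
import math

def growth_rate(f, p):
    """Expected log2-wealth growth per bet when wagering a fraction f."""
    return p * math.log2(1 + f) + (1 - p) * math.log2(1 - f)

p = 0.6
f_star = 2 * p - 1  # Kelly fraction for even odds
print(f_star, growth_rate(f_star, p))
# 0.2, ~0.029 bits per bet; equals 1 - h(0.6), the doubling rate of CT ch. 6
```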