ACCQ202: Information Theory

1. What is information? How do we store information? Source coding theorem CT 5.1-5.5, 2.1, 2.3, 2.6: info, HW1, HW1sol
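For illustration, the central quantity of this first lecture, the Shannon entropy H(X), is easy to compute directly from its definition (a minimal sketch, not part of the course materials; the example distributions are arbitrary):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    # Terms with p_i = 0 contribute 0, by the convention 0 log 0 = 0.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence less informative.
print(entropy([0.9, 0.1]))   # ≈ 0.469
```

The source coding theorem (CT 5) says this number is the fundamental limit on lossless compression, in bits per symbol.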

2. Huffman coding, entropy, CT 5.6, 2.2: HW2, HW2sol, (old notes1-2)
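The Huffman construction from this lecture can be sketched in a few lines with a priority queue (an illustrative implementation, not the course's reference code; it assumes at least two symbols):

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code {symbol: bitstring} from symbol weights."""
    # Heap entries: (weight, tiebreak, [(symbol, code_so_far), ...]).
    heap = [(w, i, [(sym, "")]) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, lo = heapq.heappop(heap)   # two least likely subtrees
        w2, _, hi = heapq.heappop(heap)
        # Merging prepends one bit: 0 for one subtree, 1 for the other.
        merged = [(s, "0" + c) for s, c in lo] + [(s, "1" + c) for s, c in hi]
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return dict(heap[0][2])

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.25})
print(code)   # e.g. a -> 1 bit, b and c -> 2 bits each
```

For this dyadic distribution the expected codeword length, 1.5 bits, meets the entropy exactly; in general Huffman coding achieves H(X) ≤ L < H(X) + 1 (CT 5.8).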

3. Entropy, mutual information, channel coding theorem, CT 2.3-2.5, 2.10: HW3, HW3sol
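As a concrete instance of the channel coding theorem, the capacity of the binary symmetric channel has the closed form C = 1 - H(p), which is simple to evaluate (a small sketch, not part of the homework):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel, crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # pure noise: capacity 0
```

The theorem says rates below C are achievable with vanishing error probability, and rates above C are not.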

4. Proof of the converse of the channel coding theorem, CT 7.9, 7.6, 7.7: HW4, HW4sol

5. Proof of the direct part of the channel coding theorem, differential entropy, Gaussian channel, joint source-channel coding: HW5, HW5sol
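The Gaussian channel discussed here also has a closed-form capacity, C = (1/2) log2(1 + P/N) bits per channel use, which a one-liner evaluates (illustrative only; the SNR values are arbitrary):

```python
import math

def gaussian_capacity(snr):
    """AWGN channel capacity in bits per use, snr = P/N (power / noise)."""
    return 0.5 * math.log2(1.0 + snr)

print(gaussian_capacity(1.0))   # 0.5 bits per use
print(gaussian_capacity(3.0))   # 1.0 bit per use
```

Capacity grows only logarithmically in the signal-to-noise ratio: doubling the power adds at most half a bit per use.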

6. Compression beyond entropy (rate-distortion): HW6, HW6sol 
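A worked instance of the rate-distortion function: for a Bernoulli(p) source under Hamming distortion, R(D) = H(p) - H(D) for 0 ≤ D ≤ min(p, 1-p), and 0 beyond (a sketch of this standard formula, not the course's code):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p, D):
    """R(D) for a Bernoulli(p) source with Hamming distortion."""
    if D >= min(p, 1 - p):
        return 0.0   # distortion large enough to need no description at all
    return h2(p) - h2(D)

# Tolerating a 10% bit-error rate lets a fair-coin source be
# described below its entropy of 1 bit per symbol.
print(rate_distortion_bernoulli(0.5, 0.1))   # ≈ 0.531
```

At D = 0 this recovers lossless compression at rate H(p); as D grows, the required rate drops continuously to zero.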

7. Betting (gambling): HW7, HW7sol
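The central quantity of the gambling lecture, the doubling rate W(b, p) = Σ p_i log2(b_i o_i) of a horse race, can be checked numerically; proportional (Kelly) betting b = p maximizes it (an illustrative sketch with made-up probabilities and odds):

```python
import math

def doubling_rate(p, b, odds):
    """Doubling rate W(b, p) = sum_i p_i * log2(b_i * o_i)."""
    return sum(pi * math.log2(bi * oi) for pi, bi, oi in zip(p, b, odds))

p = [0.5, 0.25, 0.25]        # win probabilities of three horses
odds = [3.0, 3.0, 3.0]       # uniform 3-for-1 odds

kelly = doubling_rate(p, p, odds)              # bet proportional to p
naive = doubling_rate(p, [1/3, 1/3, 1/3], odds)  # bet uniformly
print(kelly, naive)   # Kelly betting grows wealth strictly faster
```

With uniform m-for-1 odds the optimal doubling rate is log2(m) - H(p): the less random the race, the faster the gambler's wealth doubles.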