This page was last updated 1/4/2016 12:37:47
Lecture 2: Definitions of Shannon entropy, relative entropy, and their basic properties. Material is found in the books by Cover & Thomas or MacKay.
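A minimal numerical sketch of the two definitions (the function names and example distributions below are illustrative, not taken from the books):

    import numpy as np

    def shannon_entropy(p):
        """H(p) = -sum_x p(x) log2 p(x), in bits; zero-probability terms contribute 0."""
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    def relative_entropy(p, q):
        """D(p||q) = sum_x p(x) log2(p(x)/q(x)); infinite if q(x)=0 where p(x)>0."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        nz = p > 0
        if np.any(q[nz] == 0):
            return np.inf
        return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

    # A fair coin has 1 bit of entropy; D(p||q) >= 0 with equality iff p = q.
    print(shannon_entropy([0.5, 0.5]))               # 1.0
    print(relative_entropy([0.5, 0.5], [0.9, 0.1]))  # > 0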
Lecture 3: Mutual information. Paper by Simon Laughlin, Z. Naturforsch. (1981), on the contrast-response function in the fly's compound eye. Material is found in the books by Cover & Thomas or MacKay and in the original paper.
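A small sketch of the definition I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ], computed from a joint distribution; the joint distribution below is made up for illustration and is not Laughlin's data:

    import numpy as np

    def mutual_information(pxy):
        """I(X;Y) from the joint distribution p(x,y), in bits."""
        pxy = np.asarray(pxy, dtype=float)
        px = pxy.sum(axis=1, keepdims=True)     # marginal p(x)
        py = pxy.sum(axis=0, keepdims=True)     # marginal p(y)
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

    # A noiseless binary channel driven by a uniform input carries 1 bit.
    pxy = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
    print(mutual_information(pxy))  # 1.0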
Lecture 4: Kelly's horse race. Material is found in the book by Cover & Thomas.
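A rough sketch of the doubling rate W(b, p) = sum_i p_i log2(b_i o_i) for the horse race; the win probabilities and odds below are invented for illustration:

    import numpy as np

    def doubling_rate(b, p, o):
        """Expected exponential growth rate of wealth when a fraction b_i of wealth
        is bet on horse i, which wins with probability p_i and pays odds o_i-for-1."""
        b, p, o = (np.asarray(a, dtype=float) for a in (b, p, o))
        return np.sum(p * np.log2(b * o))

    # Proportional ("Kelly") betting b = p maximizes W; with fair odds o_i = 1/p_i it
    # gives W = 0, while any other full-wealth allocation has W < 0.
    p = np.array([0.5, 0.25, 0.25])               # win probabilities (illustrative)
    o = 1.0 / p                                   # fair odds
    print(doubling_rate(p, p, o))                 # 0.0
    print(doubling_rate([1/3, 1/3, 1/3], p, o))   # negative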
Lecture 5: Asymptotic Equipartition Property. Material is found in the book by MacKay.
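A quick empirical illustration of the AEP (the source distribution and sample size are arbitrary choices): for an i.i.d. sequence, -(1/n) log2 p(X_1, ..., X_n) concentrates around the entropy H(X).

    import numpy as np

    rng = np.random.default_rng(0)
    p = np.array([0.8, 0.2])                 # biased binary source (illustrative)
    H = -np.sum(p * np.log2(p))              # ~ 0.72 bits

    n = 100_000
    x = rng.choice(2, size=n, p=p)           # i.i.d. draws from the source
    log_prob = np.sum(np.log2(p[x]))         # log2 probability of the whole sequence
    print(-log_prob / n, "vs", H)            # the two numbers should be close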
Lecture 6: Channel capacity. Material is found in the books by Cover & Thomas or MacKay.
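A small sketch for the binary symmetric channel (the crossover probability below is arbitrary): its capacity is C = 1 - H_2(eps), achieved by a uniform input, and this can be checked by maximizing I(X;Y) over the input distribution numerically.

    import numpy as np

    def h2(x):
        """Binary entropy function, in bits."""
        return 0.0 if x in (0.0, 1.0) else -x*np.log2(x) - (1 - x)*np.log2(1 - x)

    def mi_bsc(a, eps):
        """I(X;Y) for a BSC with input P(X=1)=a and crossover probability eps."""
        pxy = np.array([[(1 - a)*(1 - eps), (1 - a)*eps],
                        [a*eps,             a*(1 - eps)]])
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        return np.sum(pxy * np.log2(pxy / (px @ py)))

    eps = 0.11
    print(1 - h2(eps))                                                 # closed form
    print(max(mi_bsc(a, eps) for a in np.linspace(0.001, 0.999, 999))) # numerical check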
Lecture 7: Rate distortion theory. Material is found in the book by Cover & Thomas.
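For one concrete case, the rate-distortion function of a Bernoulli(p) source under Hamming distortion is R(D) = H_2(p) - H_2(D) for 0 <= D <= min(p, 1-p), and zero beyond that; a minimal sketch (parameter values are illustrative):

    import numpy as np

    def h2(x):
        return 0.0 if x in (0.0, 1.0) else -x*np.log2(x) - (1 - x)*np.log2(1 - x)

    def rate_distortion_bernoulli(p, D):
        """R(D) for a Bernoulli(p) source with Hamming (bit-flip) distortion."""
        return max(h2(p) - h2(D), 0.0) if D <= min(p, 1 - p) else 0.0

    print(rate_distortion_bernoulli(0.5, 0.1))   # ~ 0.53 bits per source symbol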
Lecture 8: Entropy of spike trains. Material is found in the book Spikes by Rieke et al.
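A rough sketch of the plug-in "word entropy" calculation for a binned spike train; the surrogate Poisson spike train, bin size, and word length below are placeholder choices, not values from the book:

    import numpy as np

    rng = np.random.default_rng(1)

    dt, T, rate = 0.003, 200.0, 40.0                  # 3 ms bins, 200 s of data, 40 Hz
    spikes = (rng.random(int(T / dt)) < rate * dt).astype(int)   # binary spike train

    L = 8                                             # word length in bins
    n_words = len(spikes) // L
    words = spikes[:n_words * L].reshape(n_words, L)
    codes = words @ (1 << np.arange(L))               # encode each word as an integer
    counts = np.bincount(codes, minlength=2**L)
    p = counts[counts > 0] / n_words                  # empirical word distribution
    print(-np.sum(p * np.log2(p)) / (L * dt), "bits/s (plug-in word-entropy rate)")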
Lecture 9: Estimating entropies from data. Material is found in the book Biophysics by Bialek.
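A minimal sketch of the plug-in estimator and its first-order (Miller-Madow) bias correction; the toy data below are invented to show the undersampling bias:

    import numpy as np

    def plugin_entropy(counts):
        """Naive (maximum-likelihood) entropy estimate from observed counts, in bits."""
        counts = np.asarray(counts, dtype=float)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    def miller_madow_entropy(counts):
        """Plug-in estimate plus the first-order bias correction (K-1)/(2 N ln 2),
        where K is the number of occupied bins and N the number of samples."""
        counts = np.asarray(counts, dtype=float)
        N, K = counts.sum(), np.count_nonzero(counts)
        return plugin_entropy(counts) + (K - 1) / (2 * N * np.log(2))

    # With few samples the plug-in estimate falls below the true entropy (~6.64 bits here).
    rng = np.random.default_rng(0)
    true_p = np.ones(100) / 100
    counts = np.bincount(rng.choice(100, size=200, p=true_p), minlength=100)
    print(plugin_entropy(counts), miller_madow_entropy(counts))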
Lecture 10: Entropy of continuous variables and power spectra. Material is found in the books by Cover & Thomas and Bialek.
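A one-line check of the Gaussian differential entropy h = (1/2) log2(2*pi*e*sigma^2); the variances are chosen arbitrarily to show that, unlike discrete entropy, it can be negative:

    import numpy as np

    def gaussian_differential_entropy(sigma2):
        """Differential entropy of a Gaussian with variance sigma2, in bits."""
        return 0.5 * np.log2(2 * np.pi * np.e * sigma2)

    print(gaussian_differential_entropy(1.0))    # ~ 2.05 bits
    print(gaussian_differential_entropy(0.01))   # negative: depends on the units of x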
Lecture 11: Gaussian channels. Material is found in the book by Cover & Thomas.
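A minimal sketch of the AWGN capacity C = (1/2) log2(1 + P/N) and of water-filling over parallel Gaussian channels; the bisection routine and the noise levels below are illustrative choices, not from the book:

    import numpy as np

    def gaussian_capacity(P, N):
        """Capacity of a single AWGN channel, in bits per channel use."""
        return 0.5 * np.log2(1 + P / N)

    def waterfill(noise, P, tol=1e-9):
        """Water-filling over parallel Gaussian channels with noise levels `noise` and
        total power P: allocate p_i = max(nu - N_i, 0), with nu found by bisection."""
        noise = np.asarray(noise, dtype=float)
        lo, hi = noise.min(), noise.max() + P
        while hi - lo > tol:
            nu = 0.5 * (lo + hi)
            if np.maximum(nu - noise, 0).sum() > P:
                hi = nu
            else:
                lo = nu
        p = np.maximum(lo - noise, 0)
        return p, np.sum(0.5 * np.log2(1 + p / noise))

    print(gaussian_capacity(1.0, 1.0))            # 0.5 bits per use
    print(waterfill([1.0, 2.0, 4.0], P=3.0))      # more power goes to the quieter channels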
Lecture 12: Rate distortion arguments for bacterial growth. Material is found in the book by Bialek.
Lecture 13: Codes and compression. Material is found in the book by Cover & Thomas.
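A tiny illustration of the Kraft inequality, sum_i D^(-l_i) <= 1, which the codeword lengths of any uniquely decodable D-ary code must satisfy (the length sets below are arbitrary examples):

    def kraft_sum(lengths, D=2):
        """Kraft sum for codeword lengths l_i in a D-ary alphabet."""
        return sum(D ** (-l) for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))   # 1.0: these lengths exactly fill a binary code tree
    print(kraft_sum([1, 1, 2]))      # 1.25 > 1: no uniquely decodable binary code exists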
Lecture 14: Huffman coding and arithmetic coding. Material is found in the book by Cover & Thomas.
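A compact sketch of the Huffman construction (repeatedly merge the two least probable nodes); the symbol probabilities are a dyadic toy example, for which the average codeword length equals the entropy:

    import heapq
    from itertools import count

    def huffman_code(probs):
        """Binary Huffman code for a dict of symbol -> probability; returns symbol -> codeword."""
        tie = count()                                    # tie-breaker so heap tuples compare
        heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)              # two least probable subtrees
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            heapq.heappush(heap, (p0 + p1, next(tie), merged))
        return heap[0][2]

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(probs)
    avg_len = sum(probs[s] * len(w) for s, w in code.items())
    print(code, avg_len)    # average length is 1.75 bits, matching the entropy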