Information Theory ENSC 808 (3)

Information measures: entropy, relative entropy, mutual information, entropy rate, differential entropy. The Asymptotic Equipartition Property. Lossless data compression: the Kraft inequality, Huffman codes, Shannon codes, arithmetic coding. Channel capacity: the binary symmetric channel, the binary erasure channel, Shannon's channel coding theorem, the Gaussian channel, feedback. Prerequisite: STAT 270 or equivalent.

Section: G100
Instructor: Ivan Bajic
Day/Time: Tu, Th 12:30 PM – 1:50 PM
Location: RCB 7102, Burnaby