Information Theory

Lecturer (assistant)
Number: 0000001589
Type:
Duration: 5 SWS
Term: Winter semester 2024/25
Language of instruction: English
Position within curricula: See TUMonline
Dates: See TUMonline


Description

Review of probability theory.
Information theory for discrete and continuous variables: entropy, informational divergence, mutual information, inequalities.
Coding of memoryless sources: rooted trees with probabilities, Kraft inequality, entropy bounds on source coding, Huffman codes, Tunstall codes.
Coding of stationary sources: entropy bounds, Elias code for the positive integers, Elias-Willems universal source coding, hidden finite-memory sources.
Channel coding: memoryless channels, block and bit error probability, random coding, converse, binary symmetric channel, binary erasure channel, symmetric channels, real and complex AWGN channels, parallel and vector AWGN channels, source and channel coding.
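As a small illustration of the lossless source-coding topics above (not part of the official course material; the function names and the example distribution are chosen here purely for illustration), the following Python sketch computes the entropy of a discrete memoryless source, derives binary Huffman codeword lengths, and checks the Kraft inequality together with the entropy bound H(X) <= E[L] < H(X) + 1.

```python
import heapq
from math import log2

def entropy(pmf):
    # Entropy H(X) in bits of a probability mass function.
    return -sum(p * log2(p) for p in pmf if p > 0)

def huffman_lengths(pmf):
    # Codeword lengths of a binary Huffman code for the given pmf.
    # Each heap entry is (total probability, list of symbol indices in that subtree).
    heap = [(p, [i]) for i, p in enumerate(pmf)]
    heapq.heapify(heap)
    lengths = [0] * len(pmf)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # merging two subtrees adds one bit to every leaf below
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

pmf = [0.5, 0.25, 0.125, 0.125]    # illustrative source distribution (assumed)
H = entropy(pmf)
L = huffman_lengths(pmf)
avg_len = sum(p * l for p, l in zip(pmf, L))
kraft_sum = sum(2 ** -l for l in L)
print(f"H(X) = {H:.3f} bits, E[L] = {avg_len:.3f} bits, Kraft sum = {kraft_sum:.3f}")
# Expected: Kraft sum <= 1 and H(X) <= E[L] < H(X) + 1 (here the pmf is dyadic, so E[L] = H(X)).
```

For this dyadic example the Huffman code is optimal with E[L] equal to the entropy of 1.75 bits; for general sources the expected length exceeds the entropy by less than one bit per symbol.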
