ECL521 Information Theory and Coding [(3-0-0); Credit: 3]
1. Analyze self and mutual information.
2. Evaluate the information rate of various information sources.
3. Design lossless data compression codes for discrete memoryless sources.
4. Evaluate the information capacity of discrete memoryless channels and determine the code rates achievable on such channels.
5. Design simple linear block error correcting codes, select and design simple convolutional codes.
Communication processes, Channel matrix, Probability relations in a channel, The measure of information, Entropy function and its properties, Mutual Information, Symmetry of information, Jensen's Inequality, Fano's Inequality.
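As a small illustration of the entropy and mutual information definitions in this module, the following sketch computes H(X) and I(X;Y) from a toy joint distribution (the function names and the example distributions are illustrative, not part of the syllabus):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint probability matrix."""
    px = [sum(row) for row in joint]                 # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]           # marginal of Y (columns)
    hxy = entropy([p for row in joint for p in row]) # joint entropy H(X,Y)
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))                             # 1.0
# Noiseless binary channel with uniform input: I(X;Y) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))    # 1.0
# Independent X and Y: I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```

Note that I(X;Y) vanishes when X and Y are independent and reaches H(X) for a noiseless channel, consistent with the symmetry property listed above.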
Channel capacity; Special types of channels and their capacity: noiseless channels, symmetric channels, erasure channels, continuous channels; Shannon's theorem, Shannon–Hartley theorem for AWGN channels.
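The two capacity formulas central to this module, the binary symmetric channel capacity C = 1 - H2(p) and the Shannon–Hartley capacity C = B log2(1 + S/N), can be sketched as follows (the telephone-line figures in the last line are an illustrative assumption):

```python
import math

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

def shannon_hartley(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N) of an AWGN channel, in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

print(bsc_capacity(0.0))             # 1.0 (noiseless channel)
print(round(bsc_capacity(0.11), 3))  # 0.5 (heavy noise halves the capacity)
print(shannon_hartley(3000, 1000))   # about 2.99e4 bit/s (3 kHz, 30 dB SNR)
```

A useful sanity check: the BSC capacity is 0 at p = 0.5 (the output is independent of the input) and 1 at both p = 0 and p = 1.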
Encoding: Block codes, Binary codes, Binary Huffman code, Shannon–Fano encoding procedure, Noiseless coding theorem. Error-correcting codes: Examples of codes, Hadamard matrices and codes, Binary Golay code, Matrix description of linear codes, Equivalence of linear codes, The Hamming codes, The standard array, Syndrome decoding.
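Syndrome decoding, the last topic in this module, can be illustrated with the systematic Hamming (7,4) code: the syndrome s = Hr equals the column of the parity-check matrix at the position of a single-bit error. A minimal sketch, assuming NumPy and a standard systematic choice of G and H:

```python
import numpy as np

# Systematic Hamming (7,4) generator G = [I | P] and parity-check H = [P^T | I].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Encode a 4-bit message into a 7-bit codeword (arithmetic mod 2)."""
    return (msg @ G) % 2

def decode(received):
    """Syndrome decoding: a nonzero syndrome matches the H column at the error bit."""
    s = (H @ received) % 2
    if s.any():
        for j in range(7):
            if np.array_equal(H[:, j], s):
                received = received.copy()
                received[j] ^= 1          # correct the single-bit error
                break
    return received[:4]                   # systematic code: message is the first 4 bits

msg = np.array([1, 0, 1, 1])
corrupted = encode(msg).copy()
corrupted[2] ^= 1                         # flip one bit in the channel
print(decode(corrupted))                  # [1 0 1 1]
```

This lookup of the syndrome among the columns of H is a compact form of the standard-array decoding listed above, restricted to weight-one coset leaders.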
Introduction to Rate-Distortion Theory. MIMO Information Theory: Concept of diversity, Introduction to MIMO systems, Space-time coding, MIMO channels, Capacity of MIMO channels, Ergodic capacity.
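The ergodic MIMO capacity in this module is E[log2 det(I + (SNR/Nt) H H^H)] for an i.i.d. Rayleigh-fading channel H. A Monte Carlo sketch of this formula, with an illustrative (assumed) SNR of 20 dB, shows the characteristic near-linear growth in min(Nt, Nr):

```python
import numpy as np

def mimo_capacity(H, snr):
    """Capacity log2 det(I + (snr/Nt) * H * H^H) for one channel realization, in bits."""
    nr, nt = H.shape
    M = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
    return float(np.real(np.log2(np.linalg.det(M))))

def ergodic_capacity(nt, nr, snr, trials=2000, seed=0):
    """Monte Carlo estimate of ergodic capacity under i.i.d. Rayleigh fading."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(trials):
        # Complex Gaussian entries with unit variance (Rayleigh-fading magnitudes).
        H = (rng.standard_normal((nr, nt))
             + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        total += mimo_capacity(H, snr)
    return total / trials

print(ergodic_capacity(1, 1, 100))  # SISO baseline at 20 dB SNR
print(ergodic_capacity(4, 4, 100))  # roughly 4x the SISO figure
```

The 4x4 estimate coming out close to four times the 1x1 estimate is the spatial-multiplexing gain that motivates the MIMO topics above.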
Recommended Books and Links to supplementary material
Assignments
Previous Year Question Papers are available in the Department's Library.