NPTEL: NOC: Information Theory (Electrical Engineering)

Co-ordinator: Prof. Himanshu Tyagi


Lecture 1 - What is information?

Lecture 2 - How to model uncertainty?

Lecture 3 - Basic concepts of probability

Lecture 4 - Estimates of random variables

Lecture 5 - Limit theorems

Lecture 6 - Review

Lecture 7 - Source model

Lecture 8 - Motivating examples

Lecture 9 - A compression problem

Lecture 10 - Shannon entropy

Lecture 11 - Random hash

Lecture 12 - Review 2

Lecture 13 - Uncertainty and randomness

Lecture 14 - Total variation distance

Lecture 15 - Generating almost random bits

Lecture 16 - Generating samples from a distribution using uniform randomness

Lecture 17 - Typical sets and entropy

Lecture 18 - Review 3

Lecture 19 - Hypothesis testing and estimation

Lecture 20 - Examples

Lecture 21 - The log-likelihood ratio test

Lecture 22 - Kullback-Leibler divergence and Stein's lemma

Lecture 23 - Properties of KL divergence

Lecture 24 - Review 4

Lecture 25 - Information per coin-toss

Lecture 26 - Multiple hypothesis testing

Lecture 27 - Error analysis of multiple hypothesis testing

Lecture 28 - Mutual information

Lecture 29 - Fano's inequality

Lecture 30 - Measures of information

Lecture 31 - Chain rules

Lecture 32 - Shape of measures of information

Lecture 33 - Data processing inequality

Lecture 34 - Midyear Review

Lecture 35 - Proof of Fano's inequality

Lecture 36 - Variational formulae

Lecture 37 - Capacity as information radius

Lecture 38 - Proof of Pinsker's inequality

Lecture 39 - Continuity of entropy

Lecture 40 - Lower bound for compression

Lecture 41 - Lower bound for hypothesis testing

Lecture 42 - Review 7

Lecture 43 - Lower bound for random number generation

Lecture 44 - Strong converse

Lecture 45 - Lower bound for minimax statistical estimation

Lecture 46 - Variable length source codes

Lecture 47 - Review 8

Lecture 48 - Kraft's inequality

Lecture 49 - Shannon code

Lecture 50 - Huffman code

Lecture 51 - Minimax redundancy

Lecture 52 - Type based universal compression

Lecture 53 - Review 9

Lecture 54 - Arithmetic code

Lecture 55 - Online probability assignment

Lecture 56 - Compression of databases: A scheme

Lecture 57 - Compression of databases: A lower bound

Lecture 58 - Repetition code

Lecture 59 - Channel capacity

Lecture 60 - Sphere packing bound for BSC

Lecture 61 - Random coding bound for BSC

Lecture 62 - Random coding bound for general channel

Lecture 63 - Review 11

Lecture 64 - Converse proof for channel coding theorem

Lecture 65 - Additive Gaussian noise channel

Lecture 66 - Mutual information and differential entropy

Lecture 67 - Channel coding theorem for Gaussian channel

Lecture 68 - Parallel channels and water-filling