This is the second part of the lecture "Entropy and Data Compression (II): Shannon's Source Coding Theorem, The Bent Coin Lottery". It is the third lecture in the course Information Theory, Pattern Recognition, and Neural Networks, one of a series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms". A subset of these lectures formerly constituted a Part III Physics course at the University of Cambridge.
