This is the second part of the lecture "Entropy and Data Compression (I): Introduction to Compression, Information Theory, and Entropy", Lecture 2 of the course on Information Theory, Pattern Recognition, and Neural Networks. It is the second in a series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms". A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge. To find out more about entropy and data compression, see the third part of this course.