Information Theory: Three Theorems by Claude Shannon

This book provides an introduction to information theory, focusing on Shannon's three foundational theorems of 1948–1949. Shannon's first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how errors introduced by a noisy channel can be corrected. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book.
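As a rough illustration (not taken from the book), the entropy underlying the first two theorems is easy to compute for a discrete distribution; the sketch below assumes the probabilities are given as a plain list:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# so long runs of its outcomes can be compressed below 1 bit each.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

The entropy value is exactly the compression limit asserted by Shannon's first theorem: no lossless code can use fewer bits per symbol on average.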

The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and examines its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory.
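As a small taste of the coding material in Chapter 2 (this sketch is mine, not the book's; the name `kraft_sum` is hypothetical), the Kraft–McMillan inequality says that codeword lengths l_1, …, l_n over a D-letter alphabet admit a uniquely decodable (indeed prefix) code exactly when the sum of D^(-l_i) is at most 1:

```python
def kraft_sum(lengths, alphabet_size=2):
    """Sum of D^(-l) over the codeword lengths.

    By the Kraft-McMillan inequality, a prefix code with these
    lengths exists over a D-letter alphabet iff the sum is <= 1.
    """
    return sum(alphabet_size ** (-l) for l in lengths)

# Lengths (1, 2, 3, 3) admit a binary prefix code, e.g. 0, 10, 110, 111.
print(kraft_sum([1, 2, 3, 3]))  # 1.0
# Lengths (1, 1, 2) do not: the sum exceeds 1.
print(kraft_sum([1, 1, 2]))     # 1.25
```

A sum strictly below 1 means some codewords could still be shortened, which is the starting point for the optimality results the chapter proves.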

Featuring a good supply of exercises (with solutions) and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics and computer science students at the beginning graduate level.


Author(s): Antoine Chambert-Loir
Series: UNITEXT, 144
Edition: 1
Publisher: Springer
Year: 2023

Language: English
Pages: 221
City: Cham
Tags: Information Entropy; Coding; Sampling Theory; Random Variables; Fourier Series; Fourier Transform

Preface
Contents
Chapter 0. Some bits of probability theory
0.1. Summable families
0.2. Probability theory
0.3. Discrete random variables
0.4. Independence, conditional expectation
Exercises
Chapter 1. Entropy and mutual information
1.1. Entropy of a discrete random variable
1.2. Conditional entropy
1.3. Mutual information
1.4. Entropy rate
1.5. Entropy rate of Markov processes
Exercises
Chapter 2. Coding
2.1. Codes
2.2. The Kraft–McMillan inequality
2.3. Optimal codes
2.4. The law of large numbers, and compression
2.5. Transmission capacity of a channel
2.6. Coding adapted to a transmission channel
Exercises
Chapter 3. Sampling
3.1. Continuous signals and discrete signals
3.2. The Fourier series of a periodic function
3.3. The main theorems of the theory of Fourier series
3.4. Convolution and Dirichlet’s theorem
3.5. Fourier transformation
3.6. The sampling theorem
3.7. The uncertainty principle in communication theory
Exercises
Notation
Chapter 4. Solutions of the exercises
4.0. Some bits of probability theory
4.1. Entropy and mutual information
4.2. Coding
4.3. Sampling
References
Index