Theory of Information and Its Value

This English edition of Ruslan L. Stratonovich’s Theory of Information (1975) develops the theory and provides methods, techniques, and concepts for critical applications. By unifying information theory, optimization, and statistical physics, the theory of the value of information has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning and artificial intelligence algorithms, and the growth of computational resources, a clear understanding of information has become essential, and this book is even more relevant today than when it was first published in 1975. It extends the classic work of R. L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus (the Stratonovich integral) and of filtering theory, to name just two topics. Each chapter begins with basic, fundamental ideas supported by clear examples, and the material then advances in detail and depth; the reader is not required to be familiar with the more difficult and specialized material beforehand. Rather, the wealth of examples of stochastic processes and problems makes the book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics, and computer science who specialize in information theory, data analysis, or machine learning.
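
As a small, self-contained taste of the material in Chapter 1, the sketch below (illustrative only, not taken from the book; it assumes Python with NumPy available) computes the Shannon entropy H(p) = -Σ p_i log2 p_i of a discrete distribution:

    import numpy as np

    def shannon_entropy(p, base=2):
        # H(p) = -sum_i p_i * log(p_i), with 0 * log(0) taken as 0
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log(p)) / np.log(base)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.469 bits

Equiprobable outcomes (the fair coin) maximize the entropy, which is where Section 1.1 begins.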

Author(s): Ruslan L. Stratonovich [Roman V. Belavkin, Panos M. Pardalos, Jose C. Principe (Editors)]
Publisher: Springer
Year: 2020 [1975]

Language: English
Pages: 431
Tags: Information And Communication, Circuits

Foreword
Preface
Introduction
Contents
1 Definition of information and entropy in the absence of noise
1.1 Definition of entropy in the case of equiprobable outcomes
1.2 Entropy and its properties in the case of non-equiprobable outcomes
1.3 Conditional entropy. Hierarchical additivity
1.4 Asymptotic equivalence of non-equiprobable and equiprobable outcomes
1.5 Asymptotic equiprobability and entropic stability
1.6 Definition of entropy of a continuous random variable
1.7 Properties of entropy in the generalized version. Conditional entropy
2 Encoding of discrete information in the absence of noise and penalties
2.1 Main principles of encoding discrete information
2.2 Main theorems for encoding without noise. Independent identically distributed messages
2.3 Optimal encoding by Huffman. Examples
2.4 Errors of encoding without noise in the case of a finite code sequence length
3 Encoding in the presence of penalties. First variational problem
3.1 Direct method of computing information capacity of a message for one example
3.2 Discrete channel without noise and its capacity
3.3 Solution of the first variational problem. Thermodynamic parameters and potentials
3.4 Examples of application of general methods for computation of channel capacity
3.5 Methods of potentials in the case of a large number of parameters
3.6 Capacity of a noiseless channel with penalties in a generalized version
4 First asymptotic theorem and related results
4.1 Potential Γ or the cumulant generating function
4.2 Some asymptotic results of statistical thermodynamics. Stability of the canonical distribution
4.3 Asymptotic equivalence of two types of constraints
4.4 Some theorems about the characteristic potential
5 Computation of entropy for special cases. Entropy of stochastic processes
5.1 Entropy of a segment of a stationary discrete process and entropy rate
5.2 Entropy of a Markov chain
5.3 Entropy rate of part of the components of a discrete Markov process and of a conditional Markov process
5.4 Entropy of Gaussian random variables
5.5 Entropy of a stationary sequence. Gaussian sequence
5.6 Entropy of stochastic processes in continuous time. General concepts and relations
5.7 Entropy of a Gaussian process in continuous time
5.8 Entropy of a stochastic point process
5.9 Entropy of a discrete Markov process in continuous time
5.10 Entropy of diffusion Markov processes
5.11 Entropy of a composite Markov process, a conditional process, and some components of a Markov process
6 Information in the presence of noise. Shannon's amount of information
6.1 Information losses under degenerate transformations and simple noise
6.2 Mutual information for discrete random variables
6.3 Conditional mutual information. Hierarchical additivity of information
6.4 Mutual information in the general case
6.5 Mutual information for Gaussian variables
6.6 Information rate of stationary and stationary-connected processes. Gaussian processes
6.7 Mutual information of components of a Markov process
7 Message transmission in the presence of noise. Second asymptotic theorem and its various formulations
7.1 Principles of information transmission and information reception in the presence of noise
7.2 Random code and the mean probability of error
7.3 Asymptotic zero probability of decoding error. Shannon's theorem (second asymptotic theorem)
7.4 Asymptotic formula for the probability of error
7.5 Enhanced estimators for optimal decoding
7.6 Some general relations between entropies and mutual informations for encoding and decoding
8 Channel capacity. Important particular cases of channels
8.1 Definition of channel capacity
8.2 Solution of the second variational problem. Relations for channel capacity and potential
8.3 The type of optimal distribution and the partition function
8.4 Symmetric channels
8.5 Binary channels
8.6 Gaussian channels
8.7 Stationary Gaussian channels
8.8 Additive channels
9 Definition of the value of information
9.1 Reduction of average cost under uncertainty reduction
9.2 Value of Hartley's information amount. An example
9.3 Definition of the value of Shannon's information amount and α-information
9.4 Solution of the third variational problem. The corresponding potentials
9.5 Solution of a variational problem under several additional assumptions
9.6 Value of Boltzmann's information amount
9.7 Another approach to defining the value of Shannon's information
10 Value of Shannon's information for the most important Bayesian systems
10.1 Two-state system
10.2 Systems with translation invariant cost function
10.3 Gaussian Bayesian systems
10.4 Stationary Gaussian systems
11 Asymptotic results about the value of information. Third asymptotic theorem
11.1 On the distinction between the value functions of different types of information. Preliminary forms
11.2 Theorem about asymptotic equivalence of the value functions of different types of information
11.3 Rate of convergence between the values of Shannon's and Hartley's information
11.4 Alternative forms of the main result. Generalizations and special cases
11.5 Generalized Shannon's theorem
12 Information theory and the second law of thermodynamics
12.1 Information about a physical system being in thermodynamic equilibrium. The generalized second law of thermodynamics
12.2 Influx of Shannon's information and transformation of heat into work
12.3 Energy costs of creating and recording information. An example
12.4 Energy costs of creating and recording information. General formulation
12.5 Energy costs in physical channels
A.1 Rules for operator transfer from left to right
A.2 Determinant of a block matrix
References
Index