Information Theory and Selected Applications

This book analyses applications of the Shannon Measure of Information (SMI). It introduces the concept of frustration and asks how that concept may be quantified within information theory (IT), and it interprets the entropy of systems of interacting particles in terms of the SMI and of mutual information. The author examines whether the extent of frustration can be measured by mutual information, and discusses classical examples of mixing and assimilation processes whose entropy changes are interpreted in terms of the SMI. Also presented are a description of several binding systems, an interpretation of cooperativity phenomena in terms of mutual information, and a detailed discussion of the general method of maximizing the SMI to find the “best-guess” probability distribution. The book is a valuable contribution to the field of information theory and will be of interest to any scientist interested in IT and its potential applications.
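The “best-guess” distribution mentioned above is the one that maximizes the SMI subject to the constraints imposed by the available data (the maximum-entropy principle treated in chapter 6). As a minimal sketch of the idea only, not a reproduction of the book's calculations, the Python snippet below finds the maximum-SMI distribution of a six-face die constrained to a given mean number of dots; the target mean of 4.5 and the function names are assumptions made for this illustration.

    import numpy as np
    from scipy.optimize import brentq

    faces = np.arange(1, 7)    # outcomes of a six-face die: 1..6
    target_mean = 4.5          # assumed constraint: the observed average number of dots

    def mean_for(lam):
        # Under a single mean constraint, the maximum-SMI distribution
        # takes the exponential form p_i proportional to exp(-lam * x_i).
        w = np.exp(-lam * faces)
        p = w / w.sum()
        return p @ faces

    # Solve for the Lagrange multiplier that reproduces the target mean.
    lam = brentq(lambda l: mean_for(l) - target_mean, -5.0, 5.0)
    w = np.exp(-lam * faces)
    p = w / w.sum()

    smi = -(p * np.log2(p)).sum()   # SMI of the resulting distribution, in bits
    print("best-guess distribution:", np.round(p, 4))
    print("SMI = %.4f bits" % smi)

With no constraint beyond normalization, the same maximization returns the uniform distribution, the familiar maximum-uncertainty answer; each additional moment constraint adds one Lagrange multiplier of the exponential form used above.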

Author(s): Arieh Ben-Naim
Publisher: Springer
Year: 2023

Language: English
Pages: 241
City: Cham

Preface
Contents
Abbreviations
1 Introduction and Caveats
1.1 A Bit of Information About the Bit in Information Theory and the Binary Digit
1.2 Misinterpretation of Probability as SMI and SMI as Probability
1.3 SMI, in General, Is Not Entropy. Entropy Is a Special Case of SMI
1.4 The “Vennity” of Using Venn Diagrams in Representing Dependence Between Random Variables
1.4.1 The Case of Two Coins with Magnets
1.4.2 The Case of Two Regions on a Board at Which a Dart Is Thrown
1.5 The Frustrating Search for a Measure of Frustration
1.5.1 Three Coins with Magnets
1.5.2 Three Regions on a Board
1.5.3 A Caveat to the Caveat on Frustration
1.6 Levels of Confusion: Information, SMI and Bit
1.7 Information May Be Either Subjective or Objective. The SMI (as Well as Entropy) Is Always an Objective Concept
Appendix 1: Venn Diagram for Pair of Events, and for Pair of Random Variables
Appendix 2: The Monty Hall and the Equivalent Three Prisoners’ Problem
The Three Prisoners’ Problem
Two Solutions to the Three Prisoners’ Problem
A More General, but Easier to Solve, Problem
References
2 Intermolecular Interactions, Correlations, and Mutual Information
2.1 Introduction
2.2 The General Expression for the SMI of Interacting Particles
2.2.1 First Step: The Locational SMI of a Particle in a 1D Box of Length L
2.2.2 Second Step: The Velocity SMI of a Particle in a 1D “Box” of Length L
2.2.3 Third Step: Combining the SMI for the Location and Momentum of a Particle in a 1D System. Addition of Correction Due to Uncertainty
2.2.4 The SMI of One Particle in a Box of Volume V
2.2.5 The Fourth Step: The SMI of Locations and Momenta of N Independent Particles in a Box of Volume V. Adding a Correction Due to Indistinguishability of the Particles
2.2.6 The Entropy of a System of Interacting Particles. Correlations Due to Intermolecular Interactions
2.3 The SMI of a System of Interacting Particles in Pairs Only
2.4 Entropy-Change in Phase Transition
2.4.1 Solid–Gas Transition
2.4.2 Liquid–Gas Transition
2.4.3 Solid–Liquid Transition
2.5 Liquid Water
2.6 Aqueous Solutions of Inert Gases
2.7 Entropy and Mutual Information in One-Dimensional Fluids
2.7.1 The General Expression for the Entropy of a 1D Fluid
2.7.2 The General Behavior of the Probability Density Pr(r)
2.7.3 The Entropy-Change Due to Turning On the Interaction Energy
2.7.4 Conclusion
Appendix 1: Solvation Entropy of a Solute as Difference in SMI
References
3 Application of Multivariate Mutual Information to Study Spin Systems
3.1 Definition of Multivariate MI Based on Total Correlations
3.2 Definition of Multivariate MI Based on Conditional Information
3.3 Relationship Between the Conditional MI and the Various SMI
3.4 The Formal Connection Between the TI and CI
3.5 Reinterpretation of the CI in Terms of MIs
3.6 Generalization to Any N Random Variables
3.7 Some Properties of the Multivariate MI
3.8 A Three-Spin System
3.8.1 Probabilities
3.8.2 SMI and Conditional SMI
3.8.3 The Various Mutual Information for Three-Spin Systems
3.8.4 The Three-Spin System with External Field
3.8.5 The Three-Spin System with Different Geometries
3.9 Systems with Four Interacting Spins
3.9.1 Four-Spin Systems; Perfect Square
3.9.2 The Parallelogram Arrangement
3.9.3 The Rectangular Configurations
3.10 Five-Spin Systems
3.10.1 Pentagon with One Additional Interaction
3.10.2 Pentagon with Two Additional Interactions
3.10.3 Pentagon with All Pair Interactions
3.11 Six-Spin Systems
References
4 Entropy of Mixing and Entropy of Assimilation: An Information-Theoretical Approach
4.1 “Entropy of Mixing” of Two Different Ideal Gases
4.2 Entropy of Assimilation
4.3 Is There a Pure De-assimilation Process?
4.4 Racemization as a Pure De-assimilation Process
4.5 An Example of the Entropy Formulation of the Second Law
4.6 An Example of Gibbs Energy-Formulation of the Second Law
4.7 A Baffling Experiment in Systems of Interacting Particles
4.8 Communal SMI and Communal Entropy
References
5 Information Transmission Between Molecules in Binding Systems
5.1 The Method of Generating Probabilities
5.2 Adsorbing on a Single-Site Molecule
5.3 Two-Site Systems
5.3.1 Direct Communication via Ligand-Ligand Interaction
5.3.2 Indirect Communication via Conformational Changes in the Adsorbent
5.3.3 Indirect Communication Due to Conformational Changes in Each Subunit
5.4 Three-Site Systems
5.4.1 Direct Interaction Between Three Identical Sites
5.4.2 Direct Pair and Triplet Interactions
5.4.3 Direct, But Different Pair Interactions
5.5 Four-Site Systems with Direct Interactions
5.5.1 Multivariate Mutual Information (MI)
5.5.2 A Perfect Square with Equal Nearest Neighbor Interactions
5.5.3 A Perfect Square with Unequal Pair-Interactions
5.5.4 Parallelogram with Five Equal Pair Interactions
5.5.5 Tetrahedral Arrangement; Direct Interaction Between All Pairs of Sites
5.6 Four-Site Systems with Indirect Interactions Only
5.6.1 Binding Isotherms
5.6.2 The Various SMI in the Three Arrangements
5.6.3 Total Mutual Information
References
6 Calculations of the “Best-Guess” Probability Distribution Using Shannon’s Measure of Information
6.1 The Probability Distribution of an Unfair Coin
6.2 A Six-Face Die with Three Different Numbers of Dots: 1, 2, and 3
6.3 Probability Distribution of an Unfair Regular Six-Face Die
6.4 Calculation of an Approximate Micelle-Size Distribution Using Experimental Data
6.5 Computation of the Monomer Concentration
6.6 Calculation of the Moments of the Micelle-Size-Distribution
6.7 Computation of the “Best-Guess” MSD
6.8 Conclusion
References
Index