Novelty, Information and Surprise


This revised edition offers an approach to information theory that is more general than the classical approach of Shannon. Classically, information is defined for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) whose probabilities add up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two concepts derived from it, designated as information and surprise, which describe "opposite" versions of novelty: information is related more closely to classical information theory, while surprise is related more closely to the classical concept of statistical significance. In the discussion of these three concepts and their interrelations, several properties or classes of covers are defined, which turn out to be lattices. The book also presents applications of these concepts, mostly in statistics and in neuroscience.
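To make the contrast concrete, here is a minimal sketch (not from the book, and the variable names are my own) of the idea on a finite probability space: a description assigns to each outcome a proposition containing it, drawn from a cover whose propositions may overlap, and its expected novelty is the mean of -log2 of the probability of the chosen proposition. With a partition this reduces to classical Shannon entropy.

```python
from math import log2

# Finite probability space Omega with point probabilities (toy example).
p = {"a": 0.5, "b": 0.25, "c": 0.25}

def prob(A):
    """Probability of a proposition, i.e. a subset of Omega."""
    return sum(p[w] for w in A)

# A cover of Omega: unlike a partition, propositions may overlap.
cover = [frozenset({"a", "b"}), frozenset({"b", "c"}), frozenset({"a"})]

# A description d: each outcome maps to a covering proposition containing it.
description = {"a": frozenset({"a"}),
               "b": frozenset({"b", "c"}),
               "c": frozenset({"b", "c"})}

def novelty(desc):
    """Expected novelty E[-log2 P(d(omega))] of a description d."""
    return sum(p[w] * -log2(prob(desc[w])) for w in p)

print(novelty(description))  # -> 1.0 for this particular description
```

For a partition, every outcome has exactly one admissible proposition, so this quantity is forced to equal the Shannon entropy; over a genuine cover, different descriptions yield different values, which is what opens the gap between novelty, information (minimizing descriptions), and surprise.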


Author(s): Günther Palm
Series: Information Science and Statistics
Edition: 2
Publisher: Springer
Year: 2023

Language: English
Pages: 293
City: Berlin

Personal History of the Book
References
Preface to the Second Edition
Contents
List of Symbols
List of Figures
1 Introduction
Organization of the Book
Philosophy of the Book
References
Part I Surprise and Information of Descriptions
2 Prerequisites from Logic and Probability Theory
2.1 Logic and Probability of Propositions
2.2 Mappings, Functions, and Random Variables
2.3 Measurability, Random Variables, and Expectation Value
2.4 Technical Comments
References
3 Improbability and Novelty of Descriptions
3.1 Introductory Examples
3.2 Definition and Properties
3.3 Descriptions
3.4 Properties of Descriptions
3.5 Information and Surprise of Descriptions
3.6 Information and Surprise of a Random Variable
3.7 Technical Comments
3.8 Exercises
References
4 Conditional and Subjective Novelty and Information
4.1 Introductory Examples
4.2 Subjective Novelty
4.3 Conditional Novelty
4.4 Information Theory for Random Variables
4.5 Technical Comments
4.6 Exercises
References
Part II Coding and Information Transmission
5 On Guessing and Coding
5.1 Introductory Examples
5.2 Guessing Strategies
5.3 Codes and Their Relation to Guessing Strategies
5.4 Kraft's Theorem
5.5 Huffman Codes
5.6 Relation Between Codeword Length and Information
5.7 Technical Comments
5.8 Exercises
References
6 Information Transmission
6.1 Introductory Examples
6.2 Transition Probability
6.3 Transmission of Information Across Simple Channels
6.4 Technical Comments
6.5 Exercises
References
Part III Information Rate and Channel Capacity
7 Stationary Processes and Their Information Rate
7.1 Introductory Examples
7.2 Definition and Properties of Stochastic Processes
7.3 The Weak Law of Large Numbers
7.4 Information Rate of Stationary Processes
7.5 Transinformation Rate
7.6 Asymptotic Equipartition Property
7.7 Technical Comments
7.8 Exercises
References
8 Channel Capacity
8.1 Information Channels
8.2 Memory and Anticipation
8.3 Channel Capacity
8.4 Technical Comments
8.5 Exercises
References
9 How to Transmit Information Reliably with Unreliable Elements (Shannon's Theorem)
9.1 The Problem of Adapting a Source to a Channel
9.2 Shannon's Theorem
9.3 Technical Comments
9.4 Exercises
References
Part IV Repertoires and Covers
10 Repertoires and Descriptions
10.1 Introductory Examples
10.2 Repertoires and Their Relation to Descriptions
10.3 Tight Repertoires
10.4 Narrow and Shallow Covers
10.5 Technical Comments
10.6 Exercises
References
11 Novelty, Information, and Surprise of Repertoires
11.1 Introductory Examples
11.2 Definitions and Properties
11.3 Finding Descriptions with Minimal Information
11.4 Technical Comments
11.5 Exercises
References
12 Conditioning, Mutual Information, and Information Gain
12.1 Introductory Examples
12.2 Conditional Information and Novelty
12.3 Mutual Novelty and Transinformation
12.4 Information Gain, Novelty Gain, and Surprise Loss
12.5 Conditional Information of Continuous Random Variables
12.6 Technical Comments
12.7 Applications in Pattern Recognition, Machine Learning, and Life Science
12.8 Exercises
References
Part V Information, Novelty and Surprise in Science
13 Information, Novelty, and Surprise in Brain Theory
13.1 Understanding Brains in Terms of Processing and Transmission of Information
13.2 Neural Repertoires
13.3 Experimental Repertoires in Neuroscience
13.3.1 The Burst Repertoire
13.3.2 The Pause Repertoire
13.3.3 The Coincidence Repertoire
13.3.4 The Depolarization Repertoire
13.4 Neural Population Repertoires: Semantics and Syntax
13.5 Conclusion
13.6 Technical Comments
13.6.1 Coincidence
13.6.2 Coincidental Patterns
13.6.3 Spatiotemporal Patterns
References
14 Surprise from Repetitions and Combination of Surprises
14.1 Combination of Surprises
14.2 Surprise of Repetitions
14.3 Surprise of Repetitions of Patterns
14.4 Technical Comments
References
15 Entropy in Physics
15.1 Classical Entropy
15.2 Modern Entropies and the Second Law
15.3 The Second Law in Terms of Information Gain
15.4 Technical Comments
References
Part VI Generalized Information Theory
16 Order- and Lattice-Structures
16.1 Definitions and Properties
16.2 The Lattice D of Descriptions
16.3 Technical Comments
Reference
17 Three Orderings on Repertoires
17.1 Definition and Basic Properties
17.2 Equivalence Relations Defined by the Orderings
17.3 The Joins and Meets for the Orderings
17.4 The Orderings on Templates and Flat Covers
17.5 Technical Comments
17.6 Exercises
References
18 Information Theory on Lattices of Covers
18.1 The Lattice C of Covers
18.2 The Lattice Ff of Finite Flat Covers
18.3 The Lattice R of (Clean) Repertoires
18.4 The Lattice T of Templates
18.5 The Lattice P of Partitions
18.6 Technical Comments
18.7 Exercises
References
A Fuzzy Repertoires and Descriptions
A.1 Basic Definitions
A.2 Definition and Properties of Fuzzy Repertoires
Reference
B Similarity Theory
B.1 Definitions and Elementary Observations
B.2 Homomorphisms Between Weak-Metric Spaces
References
Index