Quantum-like Networks: An Approach to Neural Behavior Through Their Mathematics and Logic

Do brains compute? If they do, what do they compute and how do they do it? The first part of the book introduces the development of a model that simulates actual biological neurons more closely than do current standard models of neural networks, and deduces its physics-like and computational properties from first principles. The second part presents a collection of applications of the model to memory formation and loss, a general syntax for memory retrieval, language itself, and certain forms of aphasia. The author develops the discussion linearly, with proofs in situ, making the book essentially self-contained, and a pair of helpful appendices acquaints the reader with the necessary fundamentals of logic and mathematics. Quantum-like Networks: An Approach to Neural Behavior Through Their Mathematics and Logic will show you an entirely new approach to an ancient subject.

Author(s): Stephen A. Selesnick
Publisher: World Scientific Publishing
Year: 2022

Language: English
Pages: 353
City: Singapore

Contents
Introduction
About the Author
Notes to the Reader
Part I: Logic and Networks
Chapter 1. Logical Foundations
1.1 Irredeemable complexity: Biological systems
1.2 The doctrine of hidden variables
1.2.1 Proximity and orthogonality spaces, ortholattices and the emergence of orthologic
1.2.2 Orthologic and some of its models
1.3 Quantum-like behavior in the modal models
1.3.1 The hallmarks of quantum-like behavior, Born’s Law and the manifestations of superposition
1.4 Quantum-like vs. non-quantum-like behavior in the models
1.4.1 The crucial difference: Greetings from Limbo
1.4.2 Parameter windows
Chapter 2. Neuronal Networks
2.1 Neurons: Structure and function
2.2 Quantum-like networks and their combinations
2.2.1 Standard neural networks
2.2.2 Combinations of networks and their state spaces
2.2.3 The tensor product ⊗
2.2.4 The external direct sum ⊕ and multimodality
2.3 The bicameral neuron
2.3.1 Bicameral networks
2.3.2 Multimodal nodes
2.3.3 CAVEAT! Tensor products of b-network state spaces
2.3.4 Relabeling the nodes of an external direct sum
2.4 Bicameral network dynamics
2.4.1 Derivation of the pseudo-Hamiltonian for a b-network
2.4.2 Firing patterns and eigenstates
2.4.3 A pair of toy b-networks
2.4.4 Symmetry promotes stability
2.5 Remarks on the nature of the substrate connections
2.5.1 The significance of the exterior product
2.5.2 Probing the exterior product connection
2.6 Multimodal b-networks
2.7 Digression: Spin, the Hopfield model and a free-floating tangent
2.7.1 The Jordan-Wigner paradigm
2.7.2 A mathematical subdigression
Chapter 3. The Logic of Many Networks
3.1 Natural deduction
3.1.1 A minimal system
3.2 The Gentzen sequent calculus
3.2.1 Linear Logic and the minimal system GN
3.2.2 Interpretation of the calculus in the category of finite dimensional vector spaces
3.2.3 A model for the ! operator in the category V_{F,k}
3.2.4 The logical interpretation of storage
3.2.5 Modes of connection I
3.3 Timing sequents
3.4 Digression: A translation theorem
Part II: Applications
Chapter 4. Memory-like Processes
4.1 The fragility of connections
4.1.1 The onset of Long Term Potentiation (LTP)
4.1.2 Early LTP
4.1.3 Extrasynaptic connections and Hebb redux
4.1.4 Modes of connection II: White matter tracts
4.2 Memory and retrieval
4.2.1 Context formation and pattern completion
4.2.2 Gestalts I
4.2.3 Retrieval
4.3 Conclusions
Chapter 5. Tsien’s Theory of Connectivity
5.1 The view from GN
5.1.1 Digression: How many black squares?
5.2 The interaction Hamiltonian
5.3 Inside the FCM for a single input
5.4 The case of more than one stimulus
5.4.1 Multimodal b-neurons in the FCMs
5.4.2 Cyclicity in a special case
5.4.3 Digression: Dopaminergic (DA) cells
5.5 The time course in verbal fluency
5.6 Forgetting
5.6.1 Failure of retrieval I: ⊗ may not be implementable
5.6.2 Failure of retrieval II: Neurogenesis
5.6.3 Loss of engrammatic cohesion in short term memory loss
5.6.4 Longer term memory effects
Chapter 6. A General Syntax of Retrieval
6.1 A minimal syntax circuit
6.2 Multimodality, again
6.3 Gestalts II
6.4 Cognitive syntactic atoms and molecules
6.4.1 Syntax for firing patterns and tensors
6.4.2 Examples
6.4.3 General properties of the syntax
6.5 Language
6.5.1 Neuroanatomy is linguistic destiny
6.5.2 Why Only Us
6.5.3 The failure of ⊗ and the language of schizophrenia
6.6 Conclusions
Part III: Appendices
Appendix A: Appendix to Chapter 1
A.1 Some logical and mathematical results
A.1.1 Modal identities
A.1.2 Modal propositions
A.1.3 Ortholattices
A.1.4 Subsets of ℝⁿ\{0}
A.1.5 Orthologic: Models, completeness and the Modal Embedding Theorem
Appendix B: A Mathematics Primer
B.1 Some multilinear algebra
B.1.1 Tensor products
B.1.2 Exterior products
B.1.3 Exterior algebras
B.1.4 The Plücker Embedding
B.2 A note on convolution
Bibliography
Index