The field of mathematical psychology began in the 1950s and includes both psychological theorizing, in which mathematics plays a key role, and applied mathematics motivated by substantive problems in psychology. Central to its success was the publication of the first Handbook of Mathematical Psychology in the 1960s. The psychological sciences have since expanded to include new areas of research, and significant advances have been made in both traditional psychological domains and in the applications of the computational sciences to psychology. Upholding the rigor of the original Handbook, the New Handbook of Mathematical Psychology reflects the current state of the field by exploring the mathematical and computational foundations of new developments over the last half-century. The second volume focuses on areas of mathematics that are used in constructing models of cognitive phenomena and decision making, and on the role of measurement in psychology.
Author(s): William H. Batchelder, Hans Colonius, Ehtibar N. Dzhafarov
Series: Cambridge Handbooks in Psychology
Publisher: Cambridge University Press
Year: 2018
Language: English
Pages: 478
Tags: Mathematics, Psychology Research Methods and Statistics, Mathematical Modeling and Methods, Psychology, Psychology: General Interest, Psychology: Mathematical Models, Psychometrics
Frontmatter......Page 2
Contents......Page 5
List of Contributors......Page 6
List of Abbreviations......Page 7
Preface......Page 10
1 Stochastic Methods for Modeling Decision-making......Page 16
1.1 Introduction......Page 17
1.2 Probabilistic Modeling in Decision Theory......Page 18
1.2.1 Information Accrual......Page 19
1.2.2 Random Walk Models – An Example......Page 20
1.3 Markov Chains......Page 23
1.4 Markov Property......Page 24
1.5 Examples of Markov Chains with a Finite State Space......Page 26
1.6 Transition Probabilities in Several Steps......Page 29
1.7 Distribution of Markov Chains: Computational Aspects......Page 30
1.8 Markovianization: Markov Models with Bounded Memory......Page 32
1.9 Stopping Times......Page 33
1.10 Absorbing Markov Chains......Page 35
1.11 Hitting Probabilities......Page 38
1.11.1 More General Random Walk on Integers......Page 39
1.12 Simple Random Walk with Absorbing Boundaries......Page 41
1.13.1 Simple Random Walk on ℤ²......Page 43
1.14 Random Walk on Trees and its Applications......Page 45
1.14.1 Applications of the Random Walk on the Tree......Page 47
1.15 Continuous-time Markov Chains......Page 48
1.16 Random Walks and the Wiener Process......Page 50
1.17 The Distribution of the Wiener Process......Page 53
1.17.1 The Wiener Process with Drift: Two Approaches......Page 54
1.17.2 Functional Relation between the Parameters......Page 56
1.18 Computation with the Wiener Process......Page 57
1.18.1 Simulation of the Wiener Process......Page 60
1.20 Gaussian Vectors......Page 61
1.20.2 Invariance of Gaussian Vectors under Affine Transformations......Page 62
1.21 The Bessel Process......Page 63
1.22 The Ornstein–Uhlenbeck Process......Page 64
1.22.1 The Ehrenfest Urn Model......Page 65
1.22.2 Construction of the Ornstein–Uhlenbeck Process......Page 66
1.22.3 The Ornstein–Uhlenbeck Process: From Discrete to Continuous......Page 68
1.23 Martingales and Their Applications......Page 69
1.23.1 Martingales and Stopping Times......Page 71
1.24 Decision-making with Multiple Attributes......Page 73
1.24.1 Time and Order Schedules......Page 74
1.24.2 Implementation of Random Schedules......Page 80
References......Page 84
2.1 Binary Decision Tasks......Page 86
2.2 Signal Detection Model......Page 87
2.3 Random Walk Model......Page 90
2.4 Continuous-time Model......Page 95
2.5 Bayesian Diffusion Model......Page 98
2.6 Translation between Diffusion Models......Page 102
2.7 Predictions......Page 107
2.8 Intertrial Variability and Unfalsifiability......Page 112
2.9 Further Reading......Page 115
2.10 Conclusions......Page 116
References......Page 117
3.1 Introduction......Page 119
3.2 General Event Spaces: Formal Probability Theory......Page 124
3.3 Establishing Serial and Parallel Distinctions through the σ-Spaces......Page 127
3.5 Experimental Identifiability of Architectural Distinctions of Causal Mental Systems......Page 131
3.5.1 Distinguishing Parallel and Serial Systems with Selective Influence Manipulations......Page 135
3.6 Discussion and Conclusions......Page 140
References......Page 141
4 Identifiability of Probabilistic Models, with Examples from Knowledge Structure Theory......Page 143
4.1 An Example and Our Goals in the Chapter......Page 144
4.2 Probabilistic Models......Page 147
4.3.1 Convex Polytopes......Page 155
4.3.2 Two Theorems (and More Propositions) from Mathematical Analysis......Page 156
4.3.3 Investigating Probabilistic Models......Page 160
4.3.5 Results about Model Identifiability......Page 161
4.3.6 Summary......Page 162
4.4 Knowledge Structures: Combinatorial Aspects......Page 163
4.5 Probabilistic Knowledge Structures......Page 167
4.6 The Correct Response Model: Testability......Page 168
4.7 The Correct Response Model: Identifiability......Page 173
4.8 The Basic Local Independence Model......Page 177
4.8.1 Parameter Domain......Page 178
4.8.3 Prediction Function......Page 179
4.9 The Basic Local Independence Model: Testability......Page 180
4.10 The Basic Local Independence Model: Identifiability......Page 181
4.10.1 Trade-off Dimensions......Page 188
4.11 Latent Class Models......Page 191
4.12 Conclusions......Page 195
4.13 Bibliographical Notes......Page 196
References......Page 197
5 Quantum Models of Cognition and Decision......Page 200
5.2 Mathematical Background......Page 201
5.2.1 Hilbert Space......Page 202
5.2.2 Linear Operators and Subspaces......Page 204
5.2.3 Basis Vectors and Unitary Operators......Page 206
5.2.4 Eigenvectors and Eigenvalues of Matrices......Page 209
5.2.5 Tensor Product Spaces......Page 210
5.3.1 Events......Page 211
5.3.2 System State......Page 212
5.3.4 Commutative Events......Page 213
5.3.5 Non-commutative Events......Page 214
5.3.6 Observables and the Uncertainty Relation......Page 216
5.3.7 Generalized Measurements and Density Operators......Page 217
5.3.9 Conjunction Fallacy......Page 219
5.3.10 Question Order Effects......Page 220
5.3.11 Disjunction Effect......Page 222
5.4.1 System State......Page 223
5.4.2 Transitions between States......Page 224
5.4.3 Measurements......Page 226
5.4.4 Generalized Measurements and Density Operators......Page 227
5.4.5 Psychological Applications......Page 228
5.5 Concluding Comments......Page 232
5.5.2 Extensions and Alternatives to Quantum Probability......Page 233
References......Page 234
6 Computational Cognitive Neuroscience......Page 238
6.1.1 A Brief History......Page 239
6.2 Advantages of CCN Modeling......Page 241
6.2.2 Model Inflexibility......Page 242
6.2.4 Ability to Unite Seemingly Disparate Fields......Page 243
6.3 CCN Modeling Principles......Page 244
6.3.2 The Simplicity Heuristic......Page 245
6.4 Models of Single Spiking Neurons......Page 246
6.4.1 The Leaky Integrate-and-Fire Model......Page 247
6.4.2 The Quadratic Integrate-and-Fire Model......Page 249
6.4.3 The Izhikevich Model......Page 250
6.4.4 Modeling Synaptic Delays......Page 253
6.4.5 Noise......Page 255
6.5 Firing-rate Models......Page 256
6.6.1 Synaptic Plasticity......Page 259
6.6.2 Models of Learning in the Striatum and Cortex......Page 261
6.6.2.1 Discrete-time Models of Learning at Synapses that Lack Fast DA Reuptake......Page 262
6.6.2.2 Discrete-time Models of Learning at Synapses with Fast DA Reuptake......Page 263
6.6.2.3 Modeling DA Release......Page 265
6.6.2.4 Continuous-time Models of Hebbian Learning......Page 267
6.7.1 Single-unit Recording Data......Page 269
6.7.2 Behavioral Data......Page 270
6.7.3 fMRI Data......Page 272
6.7.4 TMS Data......Page 276
6.8 Parameter Estimation and Model Evaluation......Page 277
References......Page 279
7.1 Aggregations......Page 286
7.2.1 Reductionist Approach with N Alternatives......Page 289
7.2.2 Quasi-transitivity......Page 293
7.3.1 Computing Tallies......Page 296
7.3.2 Consequences......Page 297
7.3.3 A Basis for Outcomes......Page 299
7.3.4 Applications......Page 305
7.4 Paired Comparisons: An Analysis......Page 311
7.4.1 Paired Comparison Outcomes......Page 312
7.4.2 Transitivity......Page 315
7.4.3 Transitivity vs. Strong Transitivity......Page 317
7.4.4 Seeking Perversities......Page 318
7.4.5 Basis in Profile Space......Page 320
7.5.1 Voting: How Can These Results Be Used?......Page 324
7.5.2 Resolving Arrow’s Theorem......Page 328
7.5.3 Path-dependency and Quantum Thinking......Page 329
7.5.4 Consistency in Ratio-scaled Comparisons......Page 331
7.6 Summary......Page 334
References......Page 335
8 Categorization Based on Similarity and Features: The Reproducing Kernel Banach Space (RKBS) Approach......Page 337
8.1 Introduction......Page 338
8.1.1 Usefulness of Categories......Page 339
8.1.2 Extant Psychological Models and Issues......Page 340
8.1.3 An Emerging Unifying Framework......Page 342
8.1.4 Plan of Chapter......Page 343
8.2.1 Vector Spaces and Functionals......Page 344
8.2.1.1 Linear Functionals and Norms......Page 345
8.2.1.2 Inner Products and Hilbert Spaces......Page 346
8.2.1.3 Semi-inner Products and Banach Spaces......Page 348
8.2.1.4 Duality Mapping and Generalized Semi-inner Products......Page 349
8.2.1.5 Bases in Hilbert and Banach Spaces......Page 350
8.2.1.6 Frames in Hilbert and Banach Spaces......Page 351
8.2.2 Function Spaces and Reproducing Kernels......Page 352
8.2.2.1 Reproducing Kernel Hilbert Spaces (RKHS)......Page 353
8.2.2.3 Operator-valued Reproducing Kernels......Page 354
8.2.2.4 Measurable Functions as Vector Spaces: Lp(X), M(X), C0(X), etc.......Page 356
8.2.2.5 Integral Operators and Spectral Methods......Page 357
8.3.1 Statistical Learning Theory......Page 358
8.3.1.3 Risk of Generalization and Error Decomposition......Page 359
8.3.2 Regularization by Different Norms......Page 360
8.3.2.2 Norms to Enforce Sparsity......Page 361
8.4.1.1 Representer Theorem......Page 362
8.4.1.2 Feature Spaces and Feature Maps......Page 363
8.4.1.3 Kernel Trick......Page 364
8.4.2.2 Sample-based Hypothesis Space with ℓ1-Norm......Page 365
8.4.3 Kernel Method as a Unifying Feature and Similarity......Page 366
8.4.3.1 Dictionary Learning and Feature Selection......Page 367
8.4.3.3 Kernel Learning and Vector-valued Maps......Page 368
8.5.1 Computational Models of Human Categorization......Page 369
8.5.1.1 Exemplar Models......Page 370
8.5.1.2 Prototype Models......Page 371
8.5.1.3 Decision-bound Models......Page 372
8.5.1.4 Connectionist Models......Page 373
8.5.1.5 Bayesian Models......Page 374
8.5.2.1 Relations Among Various Categorization Models......Page 376
8.5.2.2 Unified Account by Reproducing Kernels......Page 377
8.5.2.3 Shepard Kernel and Multidimensional Input Scaling......Page 378
8.5.2.4 Incorporating Attention into Kernel Methods......Page 379
8.5.3.1 Similarity vs. Dissimilarity......Page 380
8.5.3.2 Flexible Feature Representation......Page 381
8.6 Summary: Unifying Feature and Similarity by RKBS......Page 382
References......Page 383
9 The Axiom of Meaningfulness in Science and Geometry......Page 389
Preamble......Page 390
9.1 History and Background......Page 391
9.2 Dimensional Analysis......Page 393
9.2.1 Example: Dimensional Analysis of a Simple Pendulum......Page 394
9.3 The Representational Theory of Measurement......Page 397
9.4 Meaningfulness in Geometry......Page 403
9.5 Comparison of Geometric and Measurement-theoretic Concepts......Page 406
9.6 Dimensional Analysis within the Representational Theory......Page 408
9.7 Meaningfulness as a Principle of Theory Construction......Page 409
9.7.1 An Example of a Meaningfulness Condition......Page 411
9.7.2 Examples of Abstract Axioms with their Representations......Page 412
9.7.3 Associativity and the Pythagorean Theorem......Page 414
9.7.4 Order-invariance under Transformations......Page 415
9.7.5 Some Basic Concepts......Page 416
9.8.2 Permutability and Quasi-permutability......Page 417
9.8.3 The Translation Equation......Page 419
9.8.5 The Autodistributivity Equations......Page 420
9.8.6 The Abstract Lorentz–FitzGerald–Doppler Equations......Page 421
9.9.1 Motivation......Page 422
9.9.2 Defining Meaningfulness......Page 424
9.9.3 Applications......Page 426
9.9.4 The Falmagne and Narens Definition......Page 427
9.10 Propagating Abstract Axioms via Meaningfulness......Page 431
9.10.3 The Meaningful Lorentz–FitzGerald–Doppler Systems......Page 432
9.11.1 Associativity and the Pythagorean Theorem......Page 434
9.11.2 Meaningful Quasi-permutable Laws......Page 435
9.11.3 Quasi-permutability and the Pythagorean Theorem......Page 440
9.11.4 Meaningful Bisymmetric Laws......Page 442
9.12 Order-invariance under Transformations......Page 443
9.12.1 Transformation Families......Page 444
9.12.2 Transformation Classes......Page 447
9.12.3 Transformations Acting on Codes......Page 448
9.12.4 Meaningful Transformations......Page 451
9.12.6 Beer’s Law......Page 454
9.12.7 The Monomial Laws......Page 455
9.12.8 A Counterexample: Van der Waals’ Equation......Page 456
9.13 Principles for Scientific and Geometric Content......Page 458
9.13.1 Intrinsicness......Page 466
References......Page 468
Index......Page 472