The Cambridge Handbook of Computational Psychology


This book is a definitive reference source for the growing, increasingly important, and interdisciplinary field of computational cognitive modeling, that is, computational psychology. It combines breadth of coverage with definitive statements by leading scientists in the field. Research in computational cognitive modeling explores the essence of cognition by developing detailed, process-based understanding through the specification of computational mechanisms, structures, and processes. Computational models offer conceptual clarity and precision at the same time. This book substantiates this approach through overviews and many examples.

Author(s): Ron Sun
Edition: 1
Publisher: Cambridge University Press
Year: 2008

Language: English
Pages: 768

0521857414......Page 1
Half-title......Page 3
Title......Page 5
Copyright......Page 6
Contents......Page 7
Preface......Page 9
List of Contributors......Page 11
Part I Introduction......Page 15
1. What Is Computational Cognitive Modeling?......Page 17
2. What Is Computational Cognitive Modeling Good For?......Page 19
3. Multiple Levels of Computational Cognitive Modeling......Page 22
4. Success Stories of the Past......Page 26
5. Directions for the Future......Page 27
6. About This Book......Page 30
7. Conclusions......Page 31
References......Page 32
Part II Cognitive Modeling Paradigms......Page 35
2. Background......Page 37
2.1. Historical Context......Page 38
2.2. Key Properties of Connectionist Models......Page 40
2.3. Neural Plausibility......Page 42
2.4. The Relationship between Connectionist Models and Bayesian Inference......Page 44
3.1. An Interactive Activation Model of Context Effects in Letter Perception (McClelland & Rumelhart, 1981; Rumelhart & McClelland, 1982)......Page 45
3.1.1. What happened next?......Page 47
3.2. On Learning the Past Tense of English Verbs (Rumelhart & McClelland, 1986)......Page 48
3.2.1. What happened next?......Page 51
3.3. Finding Structure in Time (Elman, 1990)......Page 52
3.3.1. What happened next?......Page 56
4.1. Cascade-Correlation and Incremental Neural Network Algorithms......Page 57
4.3. Hybrid Models......Page 58
5. Connectionist Influences on Cognitive Theory......Page 59
5.1. Knowledge Versus Processing......Page 60
5.2. Cognitive Development......Page 61
5.3. The Study of Acquired Disorders in Cognitive Neuropsychology......Page 62
5.4. The Origins of Individual Variability and Developmental Disorders......Page 63
5.5. Future Directions......Page 64
References......Page 65
1. Introduction......Page 74
2. The Basics of Bayesian Inference......Page 77
2.1. Bayes' Rule......Page 78
2.2. Comparing Hypotheses......Page 79
2.3. Parameter Estimation......Page 80
2.4. Model Selection......Page 83
3.1. Bayesian Networks......Page 85
3.2. Representing Probability Distributions over Propositions......Page 87
3.3. Causal Graphical Models......Page 88
3.4. Example: Causal Induction from Contingency Data......Page 89
4. Hierarchical Bayesian Models......Page 94
4.1. Example: Learning about Feature Variability......Page 97
4.2. Example: Property Induction......Page 98
5. Markov Chain Monte Carlo......Page 101
5.1. Example: Inferring Topics from Text......Page 104
6. Conclusion......Page 110
References......Page 111
1. Introduction......Page 116
2. Embodiment, Situatedness, and Dynamical Systems......Page 117
3. Dynamical Systems Thinking: Uniquely Instantiated Dynamics......Page 118
4.1. Activation Fields......Page 124
4.2. Field Dynamics......Page 125
4.3. Behavioral Signatures of Dynamic Fields......Page 127
5.1. Is the Dynamical Systems Approach Embodied and Situated?......Page 131
5.2. Is the Dynamical Systems Approach Neurally Based?......Page 133
5.3. What Kind of Account Does Dynamical Systems Thinking Generate?......Page 135
Appendix: Dynamical Field Theory of Perseverative Reaching......Page 136
References......Page 138
1.1. What Is Logic-Based Computational Cognitive Modeling?......Page 142
1.2. Level of Description of LCCM......Page 143
1.3. The Ancient Roots of LCCM......Page 144
1.4. LCCM's Sister Discipline: Logic-Based Human-Level AI......Page 145
1.6. Brief Overview of the Three Challenges......Page 146
1.7. Structure of the Chapter......Page 147
2. The Goal of Computational Cognitive Modeling/LCCM......Page 148
3.1.1. Desideratum #1: modeling system 1 versus system 2 cognition......Page 149
3.1.2. Desideratum #2: modeling mental logic-based, mental models-based, and mental metalogic-based reasoning......Page 151
3.1.5. Desideratum #5 puzzle 3: the wise man puzzle......Page 152
3.2. Challenge 2: Unify Cognition via a Comprehensive Theoretical Language......Page 153
4.1. Logical Systems......Page 154
4.1.2. An argument (proof) theory for…......Page 156
4.1.3. Formal semantics for…......Page 158
4.1.5. The alphabet and grammar of…......Page 160
4.2. Sample Declarative Modeling in Conformity to LCCM......Page 166
4.3. Logic-Based Computer Programming......Page 171
5.1. Meeting the Challenge of Mechanizing Human Reasoning......Page 172
5.2. Meeting the Perception/Action Challenge......Page 175
5.3. Meeting the Rigor Challenge......Page 177
6. Limitations and the Future......Page 179
7. Conclusion......Page 180
1. Introduction......Page 185
2.1. Soar......Page 186
2.2. ACT-R......Page 187
2.3. EPIC......Page 188
2.4. CLARION......Page 189
3.1. Working Memory Capacity......Page 190
3.2. Cognitive Performance......Page 192
3.2.2. Hidden computational power......Page 193
3.4. Learning......Page 194
3.4.2. Interpreting instructions stored in memory......Page 195
3.4.3. From implicit to explicit learning......Page 196
3.5.1. Constraints at the level of individual brain cells......Page 197
References......Page 198
Part III Computational Modeling of Various Cognitive Functionalities and Domains......Page 202
1. Introduction......Page 204
2.1. The CLS Model......Page 205
2.1.2. CLS model of hippocampal recall......Page 206
2.1.3. CLS model of cortical familiarity......Page 209
2.1.5. Memory decision making: a challenge for recognition memory models......Page 211
2.2. Alternative Models of Perirhinal Familiarity......Page 212
2.2.1. The anti-Hebbian model......Page 213
2.2.2. The Meeter, Myers, & Gluck model......Page 214
2.2.3. The oscillating learning algorithm......Page 215
3. Abstract Models of Recognition and Recall......Page 216
3.1. The REM Model of Recognition and Recall......Page 218
3.1.1. Representative REM results......Page 219
3.2. Differences in How Models Explain Interference......Page 220
3.3. Abstract Models and Dual-Process Theories......Page 221
4. Context, Free Recall, and Active Maintenance......Page 222
4.1. The Temporal Context Model......Page 223
4.1.1. How TCM accounts for recall data......Page 224
4.2.1. Architectures for active maintenance......Page 225
4.2.2. Integrating active maintenance and long-term memory......Page 226
5. Conclusions......Page 228
Point Neuron Activation Function......Page 231
Hebbian Learning......Page 232
Cortical and Hippocampal Model Details......Page 233
References......Page 234
1. Introduction......Page 241
2. Hierarchies and Prototypes......Page 242
2.1. Prototype and Similarity-Based Approaches......Page 244
2.2.1. Category coherence......Page 246
2.2.3. Context sensitivity......Page 247
2.3. Summary......Page 248
3.1. Hinton's (1981) Distributed Model......Page 249
3.2. The Rumelhart Model......Page 252
3.3. Feature Weighting and Category Coherence......Page 254
3.4. Context-Sensitivity......Page 260
4.1. Simple Recurrent Networks......Page 262
4.2. Latent Semantic Analysis......Page 265
4.3. The Sentence Gestalt Model......Page 267
5.1. Category-Specific Semantic Impairment......Page 270
5.2. The Convergence Model......Page 274
5.3. Summary......Page 276
6. Conclusion and Open Issues......Page 277
References......Page 278
1. Introduction......Page 282
1.3. Informal and Formal Models......Page 283
1.4. Types of Representation and Process......Page 284
1.4.3. Rule models......Page 285
1.4.7. Hybrid models......Page 286
2. Exemplar Models......Page 287
2.1. Exemplary Exemplar Models......Page 288
2.2. Similarity......Page 291
2.2.1. Continuous scale, sensitive to differences only......Page 292
2.2.2. Continuous scale, sensitive to matches......Page 293
2.2.3. Nominal scale, sensitive to differences only......Page 294
2.2.4. Nominal scale, sensitive to matches......Page 295
2.2.7. Summary of similarity formalizations......Page 301
2.3.2. Gradient descent on error......Page 302
2.3.3. Systematic or random hill-climbing......Page 303
2.3.4. Bayesian learning......Page 304
2.4.1. No recruitment: pre-loaded exemplars......Page 305
2.4.4. Performance driven recruitment......Page 306
2.5. Response Probability......Page 308
2.6. Response Time and Choice as a Function of Time......Page 309
3. Conclusion......Page 311
References......Page 312
1. Introduction......Page 317
2.1. The Evolution of Utility-Based Models......Page 318
2.2.1. Allais paradox......Page 319
2.2.3. Preference reversals......Page 320
2.2.4. Context-dependent preferences......Page 321
3.1. Heuristic Rule-Based Systems......Page 323
3.2.2. Echo......Page 324
3.3.1. Subsymbolic and symbolic computation in ACT-R......Page 325
4.1. Sequential Sampling Deliberation Process......Page 326
4.2. Connectionist Network Interpretation......Page 327
4.4. Response Mechanism......Page 328
5.1. Accounting for Violations of Independence and Stochastic Dominance......Page 329
5.2. Accounting for Preference Reversals......Page 330
5.3. Accounting for Context Dependent Preferences......Page 331
6.1. Comparison Among Models......Page 332
References......Page 333
1. Introduction......Page 337
2. Human Inductive Reasoning: The Data......Page 338
2.2. Typicality Effects......Page 339
2.4. Other Phenomena, Including Background Knowledge Effects......Page 340
3.1. Osherson et al. (1990)......Page 342
3.2. Sloman (1993)......Page 343
3.3. Bayesian Model......Page 345
4. Causal Learning and Causal Reasoning......Page 348
5.2. Everything Is Intertwingled......Page 349
5.3. Knowledge Is Power......Page 350
References......Page 351
1. Introduction......Page 354
2. The Simulation of Formal Theories of Reasoning......Page 355
3. The Simulation of Spatial Reasoning Using Mental Models......Page 358
4. The Simulation of Sentential Reasoning Using Mental Models......Page 362
5. Concepts, Models, and Minimization......Page 367
6. General Discussion: The Nature of Human Deductive Reasoning......Page 369
References......Page 371
1. Introduction: Topic, Scope and Viewpoint......Page 374
2. History......Page 376
3.1. Interpret Exhortations......Page 378
3.2. Reason from Abstract Declarative Knowledge......Page 380
3.3.2. Analogy......Page 381
3.3.3. Subsumption......Page 382
3.4. Study Someone Else's Solution......Page 383
4.1. Positive Feedback and Subgoal Satisfaction......Page 384
4.1.2. Create a new rule......Page 386
4.2. Interlude: Learning at Impasses......Page 387
4.3.2. Specialization......Page 389
4.4. Discussion......Page 390
5.1. Optimization at the Knowledge Level......Page 391
5.2. Optimization at the Computational Level......Page 393
5.3. Retrieve Solutions from Memory......Page 394
6. Capture the Statistical Structure of the Environment......Page 395
7. Obstacles and Paths to Further Progress......Page 399
References......Page 403
2. Implicit Cognition: The Phenomena......Page 411
3. Demonstrating That Implicit Learning Is Implicit......Page 414
4. Computational Models of Implicit Learning......Page 416
4.1.1. The auto-associator network......Page 418
4.1.2. The simple recurrent network......Page 420
4.1.3. The memory buffer model......Page 423
4.2. Fragment-Based Models of Implicit Learning......Page 425
4.3. Hybrid Models of Implicit Learning......Page 426
5.1. Rules versus Statistics......Page 428
5.2. Separable Systems?......Page 429
5.3. Conscious versus Unconscious Knowledge......Page 430
Competitive Chunker......Page 431
Comparison......Page 432
References......Page 433
1. Introduction......Page 437
2. Visual Attention......Page 438
2.1. The Base Model......Page 439
2.2. Explicit Computation and Representation of Attention......Page 440
2.3. Interactive Emergence of Attention......Page 443
2.4. Key Issues in Models of Visual Attention......Page 446
3. Models of Goal-Driven Attentional Control......Page 448
3.1. The Base Model......Page 449
3.2. Extensions and Alternatives to the Base Model......Page 451
3.3. Multi-Tasking......Page 453
3.4. Dual-Task Coordination......Page 456
3.5. Automaticity: Actions Without Attention?......Page 458
3.6. Unresolved Issues and Future Directions......Page 459
4. Conclusion......Page 460
References......Page 461
3. Computational Techniques......Page 466
4. Cascade-Correlation......Page 468
6. Balance Scale......Page 469
7. Past Tense......Page 471
8. Object Permanence......Page 472
9. Artificial Syntax......Page 476
10. Similarity-to-Correlation Shift in Category Learning......Page 477
11. Discrimination-Shift Learning......Page 478
12. Concept and Word Learning......Page 479
13. Abnormal Development......Page 483
14.1. Computational Diversity......Page 484
14.3. Computational Bakeoffs......Page 485
Acknowledgments......Page 487
References......Page 488
1. Introduction......Page 492
2.1. Chomsky and the Symbolic Tradition......Page 493
2.2. Connectionist Psycholinguistics......Page 495
2.3. Probabilistic Models of Language......Page 496
3. From Signal to Word......Page 497
3.2. Exploiting Distributed Representations......Page 499
3.5. Explaining the Acquired Dyslexias......Page 501
3.7. Probabilistic Approaches......Page 502
4.1. Capturing Complexity Judgment and Reading Time Data......Page 503
4.4. Plausibility and Statistics......Page 504
5. Language Acquisition......Page 505
5.1. The Poverty of the Stimulus?......Page 506
5.3. Poverty of the Stimulus, Again......Page 507
5.4. Acquiring Morphological Structure......Page 508
5.5. Acquiring Syntactic Categories......Page 509
5.6. Acquiring Lexical Semantics......Page 510
6. Conclusion and Future Directions......Page 512
References......Page 513
1. Introduction......Page 520
2.1.1. Causal learning......Page 521
2.1.2. Causal reasoning......Page 522
2.2. Impression Formation......Page 523
2.3. Group Perception and Stereotyping......Page 524
2.5. Face Perception......Page 526
2.6. Attitudes and Attitude Change......Page 527
2.7. Personality......Page 530
2.9. The Self......Page 533
2.10. Social Influence......Page 534
2.11. Dynamics of Human Mating Strategies......Page 535
2.12. Group Discussion......Page 537
3. Conclusion......Page 538
References......Page 539
1. Introduction......Page 545
2.1. A Cognitive Simulation of Games......Page 549
2.2. A Cognitive Simulation of Organizations......Page 550
2.3. A Cognitive Simulation of Group Interaction......Page 554
3.1. Dimensions of Cognitive Social Simulation......Page 556
3.2. Issues in Cognitive Social Simulation......Page 558
3.3. Directions of Cognitive Social Simulation......Page 559
References......Page 560
1. Introduction......Page 564
2. Deductive Models......Page 566
3. Schema and Analogy Models......Page 568
4. Probabilistic Models......Page 569
5. Neural Network Models......Page 571
6. Causality......Page 575
References......Page 577
1. Introduction......Page 580
2. Initial Approaches to Cognitive Modeling for Cognitive Engineering......Page 582
2.2. The Path from Unit Tasks through Interactive Routines to Embodiment......Page 583
2.3. The Legacy of Card, Moran, and Newell......Page 584
3.1. Complex Systems......Page 586
3.2.1. A brief history......Page 591
3.2.2. Reasoning from graphs......Page 593
3.2.3. Cognitive engineering models of surfing the web (and other informational displays): information search of massive data sets in heterogeneously designed spaces......Page 594
4. Conclusions......Page 597
References......Page 598
1. Introduction......Page 604
2.1. The Rescorla-Wagner (1972) Model......Page 605
2.2. Models That Learn about the Predictive Values of Absent Cues......Page 607
2.3. Models That Learn about the Predictive Values of Configurations of Cues......Page 610
2.4. Models That Learn about the Relevance of Cues......Page 612
2.5. Late Computation Models......Page 613
3. Challenges for Associative Accounts of Predictive Learning......Page 616
3.1. Use of Prior Knowledge about the Structure of Causal Relationships......Page 617
4. The Search for Boundary Conditions for Different Processes......Page 619
References......Page 621
1. Introduction......Page 627
2. Modeling Early Visual Processing......Page 630
3.1. Overview......Page 632
3.2. A "Qualitative" Model of Recognition......Page 633
3.2.2. First-order analysis......Page 637
3.2.3. Implementation issues......Page 638
3.2.4. Tests......Page 639
4.1. Overview......Page 641
4.2. Jones et al.'s Model for Incorporating Learned High-Level Influences in Early Perception......Page 642
References......Page 647
1. Introduction......Page 650
2. Coordinate Systems for Motor Control......Page 652
2.2. Actuator Coordinates......Page 653
2.3. Generalized Coordinates......Page 654
3. The Problem of Kinematic Redundancy......Page 656
5. Transforming Plans into Actions......Page 658
6. The Organization of Muscle Synergies in the Spinal Cord......Page 660
7. Motor Primitives and Field Approximation......Page 662
8. A Computational Approach to Adaptive Learning......Page 663
9. Forward and Inverse Models as a Basis for Adaptive Behavior......Page 664
10. Adaptation to State- and Time-Dependent Forces......Page 665
10.1. State Space Models of Motor Learning......Page 667
11. Noise and Uncertainty......Page 669
12. Bayesian Framework for Motor Learning......Page 671
13. Architectures for Neural Computation......Page 672
14. Conclusions......Page 674
References......Page 675
Part IV Concluding Remarks......Page 680
1. Introduction......Page 682
2. The Cognitive Aspects of Cognitive Science......Page 684
3. Emotions and Motivation......Page 688
4. Full-Fledged Humanity......Page 690
5. Social Interaction......Page 691
6. Conclusion......Page 693
References......Page 694
1.1. The Scope of Cognitive Modeling......Page 699
1.2. Levels of Analysis and Explanation......Page 700
2.2. Intrinsic Difficulties in Making Progress......Page 701
4. What Are the Functions of Vision?......Page 702
4.1. The Importance of Mobile Hands......Page 703
4.2. Seeing Processes, Affordances, and Empty Spaces......Page 704
4.3. Seeing Without Recognizing Objects......Page 705
4.5. The Role of Perception in Ontology Extension......Page 706
5. Representational Capabilities......Page 707
5.1. Is Language for Communication?......Page 708
5.2. Varieties of Complexity: Scaling Up and Scaling Out......Page 709
6. Are Humans Unique?......Page 711
6.1. Altricial and Precocial Skills in Animals and Robots......Page 712
6.2. Meta-semantic Competence......Page 713
7.1. Sample Competences to be Modeled......Page 714
7.2. Fine-Grained Scenarios are Important......Page 715
8. Resolving Fruitless Disputes by Methodological "Lifting"......Page 716
8.2. The Need to Survey Spaces of Possibilities......Page 717
8.3. Toward an Ontology for Types of Architectures......Page 718
9.1. Organizing Questions......Page 719
9.3. Assessing (Measuring?) Progress......Page 720
10. Conclusion......Page 721
References......Page 722
Author Index......Page 726
Subject Index......Page 750