Computational Cognition

Modern science is divided into three branches: the natural sciences, the engineering sciences, and the humanities. Over the last millennia, the natural and engineering sciences evolved a symbiotic relationship, but the humanities still stand apart. Today, however, designing and building a talking robot is a comparatively new challenge for which all three branches are needed. Starting from the idea that a theory of computational cognition should be as complete as possible, and trying to answer questions such as “Which ontology is required for building a computational cognition?”, the book integrates interfaces, components, functional flows, data structure, database schema, and algorithms into a coherent system with an extensive range of cognitive functions. It constitutes the background to the book “Ontology of Communication” recently published by the author (Springer, 2023).

Part I discusses the ontological distinction between a sign-based and an agent-based approach, and continues with the data structure, the content-addressable database schema, the time-linear derivations of the speak and the hear mode, resonating content, and induction, deduction, and abduction in inferencing; it concludes with a reconstruction of eight classical syllogisms as a test suite for DBS inferencing in the think mode.

Part II complements the literal use of language in the speak and hear mode with a reconstruction of syntactic mood adaptations and figurative use. The database schema of DBS is shown to lend itself not only to the traditional tasks of storage and retrieval, but also to reference, coreference, shadowing, coactivation of resonating content, and selective activation.

Part III complements the treatment of individual topics in linguistics, philosophy, and cognitive psychology with an overall software structure in the form of three interacting main components: the interface, the memory, and the production component.
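For orientation, the first of the eight classical syllogisms reconstructed as DBS inferences in Part I is modus Barbara; in its standard predicate-calculus rendering (not the DBS formalization, which uses the book's own proplet-based format) it reads:

```latex
% Classical modus Barbara: All M are P; all S are M; therefore all S are P.
% Standard textbook notation, shown here only as a point of reference.
\[
\frac{\forall x\,\bigl(M(x) \rightarrow P(x)\bigr) \qquad
      \forall x\,\bigl(S(x) \rightarrow M(x)\bigr)}
     {\forall x\,\bigl(S(x) \rightarrow P(x)\bigr)}
\]
```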

Author(s): Roland Hausser
Publisher: Springer
Year: 2023

Language: English
Pages: 253

Preface
STRUCTURE OF THE BOOK
Basic Questions
I MODEL IN AGENT OR AGENT IN MODEL?
II DATA-DRIVEN OR SUBSTITUTION-DRIVEN?
III MEDIA AND THEIR DUAL MODALITIES
IV FUNCTIONAL EQUIVALENCE AND UPSCALING CONTINUITY
V ABSTRACTION IN PROGRAMMING
VI SYMBOLIC VS. SUBSYMBOLIC
VII NOW FRONT AND THE ON-BOARD ORIENTATION SYSTEM
VIII DATA-DRIVEN BEHAVIOR
Abbreviations Referring to Preceding Work
Contents
Part I Semantics
1. Ontology
1.1 Scientific Observation
1.1.1 ROBOT’S NONLANGUAGE COGNITION OBSERVED BY SCIENTIST
1.1.2 ROBOT’S LANGUAGE COGNITION OBSERVED BY SCIENTIST
1.1.3 COMBINING LANGUAGE AND NONLANGUAGE OBSERVATION
1.2 Agent-Based vs. Sign-Based
1.2.1 ONTOLOGY OF AN AGENT-BASED APPROACH
1.2.2 PRELIMINARY STRUCTURE OF AN AGENT WITH LANGUAGE
1.2.3 DEFINING CONTINGENT MEANING IN SET THEORY
1.2.4 TWO DENOTATION FUNCTIONS
1.2.5 ONTOLOGY OF SIGN-BASED PREDICATE CALCULUS
1.3 Grounding
1.3.1 TYPE AND TOKEN OF THE CONCEPT square
1.3.2 RECOGNITION OF square
1.3.3 ACTION OF REALIZING square
1.3.4 TYPE AND TOKEN OF THE COLOR CALLED blue
1.3.5 RECOGNITION OF blue
1.3.6 ACTION OF REALIZING blue
1.4 Data Structure
1.4.1 BUILDING A FUNCTOR-ARGUMENT BY CROSS-COPYING
1.4.2 BUILDING A COORDINATION BY CROSS-COPYING
1.5 Cognitive Foundation
1.5.1 KINDS AND ORDER OF ATTRIBUTES IN A PROPLET
1.5.2 RELATED NOTIONS IN LINGUISTICS, LOGIC, AND PHILOSOPHY
1.5.3 1ST CORRELATION: SYNTACTIC AND SEMANTIC KIND
1.5.4 2ND CORRELATION: CONTENT KIND AND COMPUT. MECHANISM
1.5.5 COGNITIVE SQUARE OF DBS
1.5.6 CLOSER VIEW OF THE COGNITIVE SQUARE
1.5.7 PROPLETS INSTANTIATING THE COGNITIVE SQUARE OF DBS
1.6 Computational Pattern Matching
1.6.2 TWO KINDS OF COMPUTATIONAL PATTERN MATCHING IN DBS
1.6.3 TYPE-TOKEN MATCHING BETWEEN CORE VALUES
2. Algorithm
2.1 Time-Linear Hear Mode Derivation
2.1.1 ISOLATED LEXICAL ANALYSIS OF Lucy found a big blue square .
2.1.2 THE THREE KINDS OF DBS HEAR MODE OPERATIONS
2.1.3 TIME-LINEAR SURFACE-COMPOSITIONAL HEAR MODE DERIVATION
2.2 Hear Mode Operations
2.2.1 STRUCTURAL PROPERTIES COMMON TO DBS OPERATION KINDS
2.2.2 CROSS-COPYING (person x) AND find WITH SBJ×PRD (line 1)
2.2.3 CROSS-COPYING find AND n_1 WITH PRD×OBJ (line 2)
2.2.4 CROSS-COPYING n_1 AND big WITH DET×ADN (line 3)
2.2.5 CROSS-COPYING big AND blue WITH ADN×ADN (line 4)
2.2.6 ABSORBING square INTO n_1 WITH DET∪CN (line 5)
2.3 A-Memory
2.3.1 TWO-DIMENSIONAL DATABASE SCHEMA OF A-MEMORY
2.3.2 SCHEMATIC EXAMPLE OF A TOKEN LINE
2.3.3 STORING THE CONTENT DERIVED IN 2.1.2 IN A-MEMORY
2.4 Loomlike Clearance of Now Front
2.5 Semantic Relations of Structure
2.5.1 INTRAPROPOSITIONAL RELATIONS OF ENGLISH
2.5.2 EXTRAPROPOSITIONAL RELATIONS OF ENGLISH
2.6 Speak Mode Operations and Laboratory Set-Up
2.6.1 SEMANTIC RELATIONS UNDERLYING SPEAK MODE DERIVATION
2.6.2 NAVIGATING WITH VN FROM find TO lucy (arc 1)
2.6.3 NAVIGATING WITH N1V FROM lucy BACK TO find (arc 2)
3. Content Resonating in Memory
3.1 Reference as a Purely Cognitive Process
3.1.1 IMMEDIATE REFERENCE IN LANGUAGE COMMUNICATION
3.1.2 MEDIATED REFERENCE IN LANGUAGE COMMUNICATION
3.1.3 NONLANGUAGE IMMEDIATE VS. MEDIATED REFERENCE
3.2 Resonating Content: Coactivation by Intersection
3.2.1 DERIVING SEARCH PATTERNS FOR A 2ND DEGREE INTERSECTION
3.2.2 2ND DEGREE INTERSECTION COACTIVATING big square
3.2.3 EXAMPLE OF A 3RD DEGREE INTERSECTION
3.2.4 THREE SEARCH PATTERNS FOR A 3RD DEGREE INTERSECTION
3.3 Resonating Content: Selective Activation
3.3.1 SELECTIVE ACTIVATION FOLLOWING A COACTIVATION
3.3.2 COACTIVATION OF ONE PROPOSITION FOLLOWED BY ANOTHER
3.3.3 CONTENT SUPPORTING AN EXTRAPROPOSITIONAL COACTIVATION
3.4 DBS Inference
3.5 Induction, Deduction, Abduction
3.5.1 USING A DBS INFERENCE FOR A DEDUCTION
3.5.2 USING THE DBS INFERENCE 3.5.1 FOR AN ABDUCTION
3.6 Data-Driven Application of Operations
3.6.1 FOUR KINDS OF DATA-DRIVEN OPERATIONS
3.6.3 SEQUENTIAL APPLICATION ROUTINE
3.6.4 PARALLEL APPLICATION PRINCIPLE (PAP)
3.6.5 INFERENCES WITH THE SAME ANTECEDENT, USED DEDUCTIVELY
3.6.6 INFERENCES WITH THE SAME CONSEQUENT, USED ABDUCTIVELY
4. Logical Reasoning
4.1 Overview
4.1.1 SCHEMATIC INSTANTIATION OF A CATEGORICAL SYLLOGISM
4.1.2 THE FOUR CATEGORICAL JUDGEMENTS
4.1.3 SET-THEORETIC COUNTERPARTS OF CATEGORICAL JUDGEMENTS
4.1.4 MODUS BARBARA AS A DBS INFERENCE
4.1.5 FROM SYLLOGISM TO DBS INFERENCE
4.1.6 α BE_X IMPLIES α BE_Y
4.1.7 α BE_X IMPLIES α NOT BE_Y
4.1.8 α NOT BE_X IMPLIES α BE_Y
4.1.9 α NOT BE_X IMPLIES α NOT BE_Y
4.2 Modus Ponendo Ponens
4.2.1 MODUS PONENDO PONENS IN PREDICATE CALCULUS
4.2.2 INSTANTIATING MODUS PONENDO PONENS
4.2.3 REPHRASING MODUS PONENDO PONENS IN DBS
4.2.4 APPLYING MODUS PONENDO PONENS AS FORMALIZED IN DBS
4.3 Modus Tollendo Tollens
4.3.1 MODUS TOLLENDO TOLLENS IN PREDICATE CALCULUS
4.3.2 INSTANTIATING MODUS TOLLENDO TOLLENS
4.3.3 SET-THEORETIC VIEW OF MODUS TOLLENDO TOLLENS
4.3.4 REPHRASING MODUS TOLLENDO TOLLENS IN DBS
4.3.5 APPLYING MODUS TOLLENDO TOLLENS AS FORMALIZED IN DBS
4.4 Modi BARBARA and CELARENT
4.4.1 MODUS BARBARA IN PREDICATE CALCULUS
4.4.2 INSTANTIATING MODUS BARBARA
4.4.3 SET-THEORETIC VIEW OF MODUS BARBARA
4.4.4 REPHRASING MODUS BARBARA IN DBS
4.4.5 APPLYING MODUS BARBARA AS FORMALIZED IN DBS
4.4.6 MODUS CELARENT IN PREDICATE CALCULUS
4.4.7 INSTANTIATING CELARENT IN PREDICATE CALCULUS
4.4.8 SET-THEORETIC VIEW OF MODUS CELARENT
4.4.9 REPHRASING MODUS CELARENT IN DBS
4.4.10 APPLYING MODUS CELARENT AS FORMALIZED IN DBS
4.5 Modi DARII and FERIO
4.5.1 MODUS DARII IN PREDICATE CALCULUS
4.5.2 INSTANTIATING MODUS DARII
4.5.3 SET-THEORETIC VIEW OF MODUS DARII
4.5.4 REPHRASING MODUS DARII IN DBS
4.5.5 APPLYING MODUS DARII AS FORMALIZED IN DBS
4.5.6 MODUS FERIO IN PREDICATE CALCULUS
4.5.7 INSTANTIATING MODUS FERIO
4.5.8 SET-THEORETIC VIEW OF MODUS FERIO
4.5.9 REPHRASING MODUS FERIO IN DBS
4.5.10 APPLYING MODUS FERIO AS FORMALIZED IN DBS
4.6 Modi BAROCO and BOCARDO
4.6.1 MODUS BAROCO IN PREDICATE CALCULUS
4.6.2 INSTANTIATING MODUS BAROCO
4.6.3 SET-THEORETIC VIEW OF MODUS BAROCO
4.6.4 REPHRASING BAROCO IN DBS
4.6.5 BAROCO IN DBS
4.6.6 MODUS BOCARDO IN PREDICATE CALCULUS
4.6.7 INSTANTIATING MODUS BOCARDO
4.6.8 SET-THEORETIC VIEW OF MODUS BOCARDO
4.6.9 REPHRASING BOCARDO IN DBS
4.6.10 APPLYING BOCARDO AS A DBS INFERENCE
Summary
5. Common Sense Reasoning
5.1 Logical vs. Common Sense Reasoning
5.1.1 EXAMPLE OF AN S-INFERENCE (FERIO, 4.5.10)
5.1.2 EXAMPLE OF A C-INFERENCE (CAUSE_AND_EFFECT)
5.1.3 APPLYING THE C-INFERENCE 5.1.2
5.1.4 EXAMPLE OF A C-INFERENCE (COUNTERMEASURE)
5.1.5 APPLYING THE C-INFERENCE 5.1.4
5.2 Combining S- and C-Inferencing
5.2.1 MIXING S- AND C-INFERENCE IN A TRAIN OF THOUGHT
5.2.2 LEXICAL S-INFERENCE IMPLEMENTING HYPERNYMY
5.2.3 LEXICAL S-INFERENCE FOR abstract-summary SYNONYMY
5.2.4 LEXICAL S-INFERENCE IMPLEMENTING AN ANTONYMY
5.2.5 LEXICAL S-INFERENCE FOR handsome-good_looking SYNONYMY
5.3 Validity and Balance
5.3.1 Lucy found a big blue square IN PREDICATE CALCULUS
5.3.2 EQUIVALENCES BASED ON TRUTH AND BALANCE OPERATORS
5.4 Analogy
5.4.1 COMMON SENSE REASONING BASED ON ANALOGY
5.4.2 APPLYING THE C-INFERENCE 2 OF 5.4.1
5.5 Hierarchy Formation
5.5.1 TEXT SUMMARIZED AS car accident AND STORED IN A-MEMORY
5.5.2 SUMMARY-CREATING C-INFERENCE
5.6 Navigating vs. Inferencing
5.6.1 SEMANTIC STRUCTURE UNDERLYING MODUS PONENDO PONENS
5.6.2 NAVIGATIONAL ASPECT OF MODUS PONENDO PONENS
Part II Pragmatics
6. STAR-0 Content: Origin
6.1 Context of Interpretation
6.1.1 FIRST PRINCIPLE OF PRAGMATICS (POP-1)
6.1.2 POP-1+ AS A GENERALIZATION OF POP-1
6.1.3 FUNDAMENTAL QUESTION OF USING A CONTENT TYPE
6.1.4 STAR ATTRIBUTES S, T, A, R, 3rd, AND prn
6.2 Four Kinds of Content
6.2.1 NONLANGUAGE CONTENT TYPE
6.2.2 NONLANGUAGE CONTENT TOKEN
6.2.3 LANGUAGE CONTENT TYPE (meaning1)
6.2.4 LANGUAGE CONTENT TOKEN (meaning2)
6.3 On-Board Orientation System: STAR-0
6.3.1 STRUCTURE OF DBS RECOGNITION
6.3.2 INTERNAL RECOGNITION OF I feel hungry. BOUND TO STAR-0
6.3.6 EXTERNAL RECOGNITION You fall into pool. BOUND TO STAR-0
6.3.7 PATTERN CHARACTERIZING 6.3.6 AS EXTERNAL RECOGNITION
6.4 Invariance Constraint and Identity in Change
6.4.1 THREE CONTENT KINDS OF THE SEMANTIC KIND REFERENT
6.4.2 TWO CONTENT KINDS OF THE SEMANTIC KIND PROPERTY
6.4.3 ONE CONTENT KIND OF THE SEMANTIC KIND RELATION
6.4.4 INVARIANCE CONSTRAINT
6.4.5 SYNTACTIC-SEMANTIC INVARIANCE OF FIGURATIVE USE
6.4.6 EXAMPLE USING ALL THREE SEMANTIC KINDS FIGURATIVELY
6.4.7 PRESENCE VS. ABSENCE OF IDENTITY IN CHANGE
6.5 Referent and Shadow in Reference and Coreference
6.5.1 COMPARING THE NAIVE AND THE COMPUTATIONAL SOLUTION
6.5.2 INITIAL OCCURRENCE OF A NOUN PROPLET AT THE NOW FRONT
6.5.3 NOUN PROPLET AFTER LOOM-LIKE NOW FRONT CLEARANCE
6.5.4 NOUN PROPLET REFERRING TO INITIAL INSTANCE
6.5.5 REFERENCE AND COREFERENCE
6.6 Generalized Reference
6.6.1 FOUR CONSTELLATIONS OF GENERALIZED COREFERENCE
6.6.2 INFERENCE DERIVING A NOUN SHADOW
6.6.3 USE OF A KNOWN REFERENT IN A NEW CONTENT
6.6.4 SHADOWING STORED PROPOSITION TO NOW FRONT
6.6.5 USING SHADOWED CONSEQUENT FOR ABDUCTIVE USE
6.6.6 CORRECTING STORED CONTENT
Synopsis: The Semantic-Pragmatic Distinction in DBS
7. STAR-1 Content: Speaker’s Perspective
7.1 From STAR-0 Content to STAR-1 Perspective
7.1.1 STAR-0 CONTENT OF ORIGIN: I see you.
7.1.2 PRAGMATICALLY EQUIVALENT VARIANTS OF STAR-0 CONTENT
7.1.3 STAR-1 CONTENT A: I saw hector. (nonlanguage)
7.1.4 STAR-1 CONTENT B: I saw you.
7.1.5 STAR-1 CONTENT C: I saw him.
7.1.6 STAR-1 CONTENT D: I saw Hector.
7.1.7 STAR-1 CONTENT E: I saw Hector on Thursday.
7.1.8 STAR-1 CONTENT F: I saw Hector on Thursday in the yard.
7.1.9 STAR-1 SURFACE VARIANTS OF THE STAR-0 CONTENT 7.1.1
7.1.10 SEMANTIC RELATIONS GRAPH UNDERLYING THE 7.1.6 VARIANTS
7.2 Indexical STAR-0 STAR-1 Inference
7.2.1 FIRST INPUT ITEM TO INDEXICAL STAR-0 STAR-1 INFERENCE
7.2.2 SECOND INPUT ITEM TO INDEXICAL STAR-0 STAR-1
7.2.3 OUTPUT OF THE INDEXICAL STAR-0 STAR-1 INFERENCE 7.2.4
7.2.4 INDEXICAL STAR-0 STAR-1 INFERENCE
7.3 Adapting STAR-0 Imperative into STAR-1 Interrogative
7.3.1 INPUT: IMPERATIVE, OUTPUT: INTERROGATIVE
7.3.2 IMP-INT INFERENCE FOR STAR-0 STAR-1 MOOD ADAPTATION
7.3.3 CANONICAL DBS
7.3.4 CANONICAL HEAR MODE DERIVATION OF GERMAN SURFACE
7.4 Adapting STAR-0 Interrogative into STAR-1 Declarative
7.4.1 INPUT: INTERROGATIVE, OUTPUT: DECLARATIVE
7.4.2 INT-DECL INFERENCE FOR STAR-0 STAR-1 MOOD ADAPTATION
7.5 Adapting STAR-0 Imperative into STAR-1 Declarative
7.5.1 INPUT: IMPERATIVE, OUTPUT: DECLARATIVE
7.5.2 IMP-DECL INFERENCE FOR STAR-0 STAR-1 MOOD ADAPTATION
7.6 Repeated Hearsay
7.6.1 TIME LINE OF A REPEATED HEARSAY LANGUAGE CONTENT
7.6.2 STAR-1 CONTENT OF Tweety said that Hector fell into pool.
7.6.3 STAR-2 CONTENT OF Tweety said that Hector fell into pool.
8. STAR-2 Content: Hearer’s Perspective
8.1 From STAR-1 Surface to STAR-2 Perspective
8.1.1 TIME LINE OF ORIGIN, PRODUCTION, AND INTERPRETATION
8.1.2 PRAGMATICALLY EQUIVALENT VARIANTS OF STAR-1 CONTENTS
8.1.3 STAR-2 CONTENT B: You saw me DERIVED FROM 7.1.4
8.1.4 STAR-2 CONTENT C: you saw him DERIVED FROM 7.1.5
8.1.5 STAR-2 CONTENT D: You saw Hector DERIVED FROM 7.1.6
8.1.6 STAR-2 CONTENT E: You saw Hector on Thursday DERIVED FROM 7.1.7
8.1.7 STAR-2 CONTENT F: You saw Hector on Thursday in the yard DERIVED FROM 7.1.8
8.2 Indexical STAR-1 STAR-2 Inference
8.2.1 INDEXICAL STAR-1 STAR-2 INFERENCE OF THE HEAR MODE
8.2.2 STAR-0 TO STAR-1 AND STAR-1 TO STAR-2 TRANSITIONS
8.3 Reverting STAR-1 Interrogative into STAR-2 Imperative
8.3.1 IMP-INT CONVERSION
8.3.2 HEAR MODE USING THE IMP-INT INFERENCE 7.3.2 ABDUCTIVELY
8.3.3 RESULT OF THE HEARER’S STAR-1 STAR-2 TRANSITION
8.4 Reverting STAR-1 Declarative into STAR-2 Interrogative
8.4.1 ABDUCTIVE HEAR MODE USE OF INT-DECL INFERENCE 7.4.2
8.4.2 INTERMEDIATE CONTENT SERVING AS INPUT TO 8.2.1
8.4.3 RESULT OF THE STAR-1 STAR-2 CONVERSION
8.4.4 INT-DECL CONVERSION IN SPEAK AND HEAR MODE
8.5 Reverting STAR-1 Declarative into STAR-2 Imperative
8.5.1 IMP-DECL CONVERSION IN SPEAK AND HEAR MODE
8.5.2 USING THE IMP-DECL INFERENCE 7.5.2 ABDUCTIVELY
8.5.3 INTERMEDIATE CONTENT RESULTING FROM ABDUCTIVE 8.5.2
8.5.4 FINAL RESULT OF THE STAR-1 STAR-2 CONVERSION
8.6 Imitation: Using a Recognition Content for Action
8.6.1 CONTENT OF John brings Mary flowers and she smiles
8.6.2 ABSTRACT PATTERN DERIVED FROM CONTENT 8.6.1
8.6.3 EXTRAPROPOSITIONAL COORDINATION TURNED INTO INFERENCE
8.6.4 ATTEMPT AT SELF-PERFORMANCE OF THE INFERENCE 8.6.3
8.6.5 INSTANTIATING THE X IN 8.6.4
9. Pragmatics of Figurative Use
9.1 Referent: Hyponymy
9.1.1 LEXICAL S-INFERENCE IMPLEMENTING HYPONYMY
9.1.2 DEDUCTIVE USE OF HYPONYMY INFERENCE (SPEAK MODE)
9.1.3 ABDUCTIVE USE OF HYPONYMY INFERENCE (HEAR MODE)
9.1.4 FIGURATIVE USE BASED ON HYPONYMY RELATION
9.2 Referent: Property-Sharing Metaphor
9.2.1 PROPERTY-SHARING INFERENCE
9.2.2 DEDUCTIVE USE OF PROPERTY-SHARING INFERENCE
9.2.3 ABDUCTIVE USE OF PROPERTY-SHARING INFERENCE
9.2.4 FIGURATIVE USE BASED ON SHARED PROPERTY INFERENCE
9.3 Referent: Metonymy
9.3.1 PART-WHOLE INFERENCE
9.3.2 DEDUCTIVE USE OF METONYMY INFERENCE
9.3.3 ABDUCTIVE USE OF METONYMY INFERENCE
9.3.4 FIGURATIVE USE BASED ON METONYMY RELATION
9.4 Relation: Transitive Verb
9.4.1 FIGURATIVELY USED TRANSITIVE VERBS
9.4.2 SEMANTIC FIELD FOR FIGURATIVE USE OF steal show
9.4.3 SEMANTIC FIELD FOR FIGURATIVE USE OF play market
9.5 Property: Intransitive Verb
9.5.1 METAPHORIC USE OF A PROPERTY
9.5.2 SEMANTIC FIELD OF melt
9.6 Property: Adnominal and Adverbial Modifiers
9.6.1 ABBREVIATING USE OF ADNOMINAL great
9.6.2 INFERENCE CODING ABBREVIATING USE OF great
9.6.3 SPEAK AND HEAR MODE OF AN ABBREVIATING ADNOMINAL USE
9.6.4 DEDUCTIVE USE OF ABBREVIATING ADNOMINAL INFERENCE
9.6.5 ABDUCTIVE USE OF ABBREVIATING ADNOMINAL INFERENCE
9.6.6 INFERENCE CODING ABBREVIATING USE OF have enough
9.6.7 SPEAK AND HEAR MODE OF AN ABBREVIATING ADVERBIAL USE
9.6.8 DEDUCTIVE USE OF ABBREVIATING ADVERBIAL INFERENCE
9.6.9 ABDUCTIVE USE OF ABBREVIATING ADVERBIAL INFERENCE
10. Nonstandard Variants of Pragmatics
10.1 Overhearing
10.1.1 Are you hungry? AS A CONTENT TYPE (meaning1)
10.1.2 STAR-1 INTERPRETATION BY INTENDED HEARER MARY
10.1.3 STAR-2 INTERPRETATION BY INTENDED HEARER MARY
10.1.4 UNINTENDED HEARER INTERPRETS Are you hungry?
10.2 Pragmatics of Demanding Reading
10.2.1 FIRST SENTENCE OF SECTION III BY ALBERT EINSTEIN (1920)
10.2.2 SYNTACTIC-SEMANTIC ANALYSIS OF EINSTEIN’S SENTENCE
10.2.3 PROPLET REPRESENTATION OF EINSTEIN’S SENTENCE
10.2.4 VARIATION OF SENTENCE 10.2.1 WITH DIFFERENT CORE VALUES
10.2.5 VARIANT OF THE SEMANTIC RELATIONS GRAPH 10.2.2
10.3 How Repeated Study Changes Interpretation
10.3.1 CONTINUING AFTER SENTENCE 10.2.1
10.3.2 SENTENCE ON P. 32 OF EINSTEIN (1920)
10.3.3 PROPLET REPRESENTATION OF 10.3.2
10.3.4 ADDITIONAL CONTENT BETWEEN FIRST AND SECOND READING
10.4 Language Ritual: Speak Mode
10.4.1 SPEAK MODE: STAR-0 STAR-1 CONVERSION
10.4.2 HEAR MODE: STAR-1 STAR-2 CONVERSION
10.4.3 STAR-0 STAR-1 AND STAR-1 STAR-2 CONVERSIONS
10.4.4 SPEAKER’S DEDUCTIVE APPLICATION OF WEDDING INFERENCE
10.5 Language Ritual: Hear Mode
10.5.1 HEARER’S ABDUCTIVE APPLICATION OF WEDDING INFERENCE
10.5.2 DEDUCTIVE STAR-2 INDEXICAL ADJUSTMENT
10.6 List of DBS Inferences
10.6.1 CONSTELLATIONS OF LOGICAL S-INFERENCES
10.6.2 LEXICAL S-INFERENCES
10.6.3 C-INFERENCES
10.6.4 INFERENCES OF LOGICAL DEDUCTION
10.6.5 INFERENCES FOR INDEXICAL ADAPTATION
10.6.6 INFERENCES FOR SYNTACTIC MOOD ADAPTATION
10.6.7 INFERENCES FOR FIGURATIVE USE
10.6.8 EXPLICIT PERFORMATIVE INFERENCE
10.6.9 INFERENCE FOR SHADOWING
10.6.10 GENERIC INFERENCE
Part III Data-Driven Behavior Control
11. Interface Component
11.1 Three Main Components of Cognition
11.1.1 TWO-DIMENSIONAL LAYOUT OF DBS COGNITION COMPONENTS
11.2 Sensory vs. Processing Media with Their Modalities
11.2.1 MEDIA AND MODALITIES IN NATURAL LANGUAGE
11.2.2 PROCESSING MEDIA AND THEIR DUAL MODALITIES
11.3 Mapping between Concepts and Raw Data
11.3.1 CORRELATION OF CONCEPTS AND SURFACES IN A LANGUAGE
11.3.2 RULE FOR RECOGNIZING THE COLOR
11.3.3 EXAMPLES OF SEMANTIC FIELDS
11.3.4 RULE FOR PRODUCING THE COLOR
11.3.5 SIMILARITY AND DIFFERENCE BETWEEN COLOR CONCEPT TYPES
11.3.6 SURFACES DISTINGUISHING RECOGNITION VS. ACTION USE
11.3.7 ALTERNATIVE USES OF THE ACTION CONCEPT
11.4 Hear Mode: Surface Interpretation
11.4.1 LETTER TYPES MATCHING RAW DATA IN THE VISION MODALITY
11.4.2 THE FOUR STEPS OF THE MORPHEME APPROACH
11.4.3 THE THREE STEPS OF THE ALLOMORPH APPROACH
11.5 Speak Mode: Surface Production
11.5.1 COMPARING WORD FORM RECOGNITION WITH PRODUCTION
11.5.2 TRANSITION 1: FROM CONNECTED PROPLET TO SURFACE TYPE
11.5.3 TRANSITION 2: FROM SURFACE TYPE TO RAW DATA
11.6 Moment by Moment Monitoring
11.6.1 STAR TEMPLATE OF THE ON-BOARD ORIENTATION SYSTEM
11.6.2 STEPS OF SUPPLYING A MISSING STAR-0 VALUE
11.6.3 EXAMPLE OF A PARALLEL APPLICATION
Terminological Remark
12. Memory Component
12.1 String Search
12.1.1 LETTERS OF A TEXT AS A NUMBERED LIST
12.1.2 INVERTED FILE FOR THE ENGLISH SENTENCE IN 12.1.1
12.1.3 EXAMPLE OF A TRIE STRUCTURE STORING young AND est
12.2 Three Memory Sections
12.2.1 VERTICAL AND HORIZONTAL ASPECTS OF A- AND B-MEMORY
12.2.2 DEFINITION OF cook IN B-MEMORY
12.3 Interpretation of Indexicals and Names
12.3.1 DIFFERENT INDEXICAL INTERPRETATIONS OF SAME CONTENT
12.3.2 COREFERENTIAL PRONOUN INTERPRETATION IN COORDINATION
12.4 Storage in the Hear Mode
12.4.1 BASE FORM AND REGULAR DERIVATA SHARING STEM
12.4.2 REGULAR SURFACES WITH THE SAME CORE VALUE
12.4.3 SET OF LANGUAGE PROPLETS SHARING THE CORE VALUE know
12.4.4 OWNER-BASED STORAGE OF LANGUAGE PROPLET AT NOW FRONT
12.5 Retrieval in the Speak Mode
12.5.1 SHADOW-BASED PRODUCTION OF GERMAN SURFACE
12.6 Episodic vs. Generic Contents
12.6.1 DERIVING GENERIC STAR-0 CONTENT BY INFERENCE
13. Operation Component
13.1 Traversing Extrapropositional Coordination
13.1.1 CONTENT OF Julia slept. John sang. Suzy dreamt.
13.1.2 TRAVERSING AN EXTRAPROPOSITIONAL COORDINATION
13.2 Traversing Extrapropositional Functor-Argument
13.2.1 ADVERBIAL MODIFIER CONSTRUCTION AS A SET OF PROPLETS
13.2.2 CANONICAL GRAPH STRUCTURE OF 13.2.1
13.2.3 INTERSECTION PATTERNS FOR FINDING be happy CONTENTS
13.3 Paraphrase as Continuation Alternatives
13.4 Data-Driven Adaptive Cognition
13.5 Functional Flow of Language Cognition
13.5.1 TRIGGER POINTS IN THE LANGUAGE COMMUNICATION CYCLE
13.6 Functional Flow of Nonlanguage Cognition
13.6.1 NONLANGUAGE A|N OPERATION
13.6.2 TRIGGER POINTS OF NONLANGUAGE COGNITION
14. Degrees of Abstraction
14.1 Computational Similarity
14.1.1 STRICT EQUIVALENCE BETWEEN A CONTENT AND A PATTERN
14.1.2 CONVERTING A PATTERN INTO EQUIVALENT CONTENTS
14.1.3 SET OF CONTENTS WITH PARTIAL OVERLAP
14.1.4 REPRESENTING THE SET 14.1.3 AS A SINGLE PATTERN
14.2 Controlling Degrees of Abstraction
14.2.1 REPLACING CONCEPTS WITH INDEXICALS OR VARIABLES
14.2.2 REPLACING VARIABLES WITH CONCEPTS
14.3 On the Fly Derivation of a New Inference
14.3.1 STEP 1: CONVERTING A CONTENT INTO A PATTERN
14.3.2 STEP 2: INDUCTION TURNING PATTERN INTO DBS INFERENCE
14.4 Autonomous Behavior Control
14.4.1 MOST BASIC OPTIONS OF BEHAVIOR CONTROL
14.4.2 FUNCTIONAL FLOW FOR BEHAVIOR WITHOUT PRECEDENT
14.5 Using Control Theory for Maintaining Basic Balance
14.5.1 DISTINCTIONS OF CONTROL THEORY
14.6 Polysemy and Homonymy
14.6.1 ALLEGED EXAMPLES OF POLYSEMY
14.6.2 EVOLUTION OF A HOMONYMY
14.6.3 EVOLUTION OF A POLYSEMY
15. Conclusion: The Fundamental Alternatives
15.1 Continuation vs. Substitution
15.1.1 COMPARING UNBOUNDED DEPENDENCY IN PSG AND DBS
15.2 Input-Output Equivalence
15.2.1 INPUT-OUTPUT EQUIVALENCE IN LANGUAGE COMMUNICATION
15.2.2 DESIGN REQUIREMENTS FOR COMPUTATIONAL DBS COGNITION
15.3 Procedural vs. Metalanguage-Based Semantics
15.3.1 CATEGORIAL GRAMMAR DERIVATION OF Every dog snores.
15.3.2 λ-REDUCTION FOR Every dog snores
15.3.3 NOUN, VERB, AND ADJ FLATTENED INTO MINI-PROPOSITIONS
15.3.4 Every dog finds a bone IN PREDICATE CALCULUS
15.3.5 READINGS ATTRIBUTED TO Every man loves a woman
15.3.6 UNAMBIGUOUS ANALYSIS AS A DBS CONTENT
15.3.7 INCORRECT ANALYSIS OF WELL-FORMED SENTENCE
15.3.8 TREATMENT OF THE DONKEY SENTENCE IN DBS
15.4 Concatenation vs. Embedding
15.4.1 PSG TREE AND EMBEDDING IN EQUIVALENT HPSG ANALYSIS
15.4.2 PSG FOR REPEATING ADNOMINALS
15.4.3 COORDINATION OF ADNOMINALS AS A SET OF PROPLETS
15.4.4 ADN REPETITION AS ANALYZED IN DBS
15.4.5 UNBOUNDED REPETITION IN SUBJECT GAPPING (TExer Sect.5.2)
15.4.6 UNBOUNDED REPET. IN PREDICATE GAPPING (TExer Sect.5.3)
15.4.7 UNBOUNDED REPETITION IN OBJECT GAPPING (TExer Sect.5.4)
15.4.8 UNBOUNDED REPETITION OF PREPOSITIONAL NOUN PHRASES
15.4.9 UNBOUNDED REPETITION OF OBJECT CLAUSES (TExer Sect.5.5)
15.4.10 UNBOUNDED ADNOMINAL CLAUSE REPETITION (TExer Sect.5.6)
15.5 Iteration vs. Recursion
15.5.1 PSG TREE AND EQUIVALENT BRACKETING STRUCTURE
15.5.2 PSG GENERATING BOTH FORMATS IN 15.5.1
15.5.3 LEXICAL PSG RULES WHICH TERMINATE RECURSION IN 15.5.1
15.5.4 ITERATION OF OBJECT CLAUSES IN DBS
15.5.5 John says that Bill believes that Mary loves Jim.
15.6 Content- vs. Coordinate-Addressable Memory
15.6.1 CHARACTERISTIC DIFFERENCES AT A GLANCE
Summary
Bibliography
Name Index
Subject Index