Vector Semantics


This open access book introduces vector semantics, which links the formal theory of word vectors to the cognitive theory of linguistics.

The computational linguists and deep learning researchers who developed word vectors have relied primarily on the ever-increasing availability of large corpora and of computers with highly parallel GPU and TPU compute engines, and their focus is on endowing computers with natural language capabilities for practical applications such as machine translation or question answering. Cognitive linguists investigate natural language from the perspective of human cognition, the relation between language and thought, and questions about conceptual universals, relying primarily on in-depth investigation of language in use.

Although both schools have ‘linguistics’ in their names, so far there has been very limited communication between them, as their historical origins, data collection methods, and conceptual apparatuses are quite different. Vector semantics bridges the gap by presenting a formal theory, cast in terms of linear polytopes, that generalizes both word vectors and conceptual structures: it treats each dictionary definition as an equation, and the entire lexicon as a set of equations mutually constraining all meanings.
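The idea of treating each dictionary definition as an equation over word vectors can be illustrated with a toy sketch. The miniature lexicon, the averaging rule, and all vector dimensions below are illustrative assumptions for this sketch only, not the book's actual formalism (which uses linear polytopes and the 4lang lexicon): each headword's vector is constrained to equal the mean of its definers' vectors, and iterating the constraints drives the system toward a mutually consistent solution.

```python
import numpy as np

# Hypothetical miniature dictionary: each headword is "defined" by other words.
definitions = {
    "puppy":  ["young", "dog"],
    "kitten": ["young", "cat"],
    "dog":    ["animal"],
    "cat":    ["animal"],
}
vocab = ["puppy", "kitten", "dog", "cat", "young", "animal"]

rng = np.random.default_rng(0)
dim = 8
# Start from random vectors; undefined words ("young", "animal") stay fixed
# and act as anchors for the system of equations.
vec = {w: rng.normal(size=dim) for w in vocab}

# Treat each definition as an equation, vec(head) = mean(vec(definers)),
# and iterate the constraints to a fixed point (a crude Jacobi-style sweep).
for _ in range(50):
    for head, definers in definitions.items():
        vec[head] = np.mean([vec[d] for d in definers], axis=0)

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words with parallel definitions end up close together.
print("puppy~kitten:", cos(vec["puppy"], vec["kitten"]))
print("puppy~animal:", cos(vec["puppy"], vec["animal"]))
```

In this toy system "puppy" and "kitten" converge to nearly identical vectors, since their definitions impose the same constraints up to the shared anchors; this is only meant to convey the flavor of a lexicon acting as a set of mutually constraining equations.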

Author(s): András Kornai
Series: Cognitive Technologies
Publisher: Springer
Year: 2023

Language: English
Pages: 280
City: Singapore

Preface
Contents
Foundations of non-compositionality
Background
Lexicographic principles
The syntax of definitions
The geometry of definitions
The algebra of definitions
Parallel description
From morphology to syntax
Lexical categories and subcategories
Bound morphemes
Relations
Linking
Naive grammar
Time and space
Space
Time
Indexicals, coercion
Measure
Negation
Background
Negation in the lexicon
Negation in compositional constructions
Double negation
Quantifiers
Disjunction
Valuations and learnability
The likeliness scale
Naive inference (likeliness update)
Learning
Modality
Tense and aspect
The deontic world
Knowledge, belief, emotions
Defaults
Adjectives, gradience, implicature
Adjectives
Gradience
Implicature
Spreading activation
Trainability and real-world knowledge
Proper names
Trainability
Dynamic embeddings
Applications
Fitting to the law
Pragmatic inferencing
Representation building
Explainability
Summary
References
Index
External index
Appendix: 4lang