Algebraic Structures in Natural Language

Algebraic Structures in Natural Language addresses a central problem in cognitive science: the learning procedures through which humans acquire and represent natural language. Until recently, algebraic systems dominated the study of natural language in formal and computational linguistics, AI, and the psychology of language, with linguistic knowledge seen as encoded in formal grammars, model theories, proof theories and other rule-driven devices. Recent work on deep learning has produced an increasingly powerful set of general learning mechanisms that do not rely on rule-based algebraic models of representation. The success of deep learning in NLP has led some researchers to question the role of algebraic models in the study of human language acquisition and linguistic representation. Psychologists and cognitive scientists have also been exploring explanations of language evolution and language acquisition that rely on probabilistic methods, social interaction and information theory, rather than on formal models of grammar induction.

This book addresses the learning procedures through which humans acquire natural language, and the way in which they represent its properties. It brings together leading researchers from computational linguistics, psychology, behavioral science and mathematical linguistics to consider the significance of non-algebraic methods for the study of natural language. The text represents a wide spectrum of views, ranging from the claim that algebraic systems are largely irrelevant to the contrary position that non-algebraic learning methods are merely engineering devices for efficiently identifying the patterns that underlying grammars and semantic models generate for natural language input. There are also interesting and important perspectives that fall at intermediate points between these opposing approaches, combining elements of both. The book will appeal to researchers and advanced students in each of these fields, as well as to anyone who wants to learn more about the relationship between computational models and natural language.

Author(s): Shalom Lappin, Jean-Philippe Bernardy
Publisher: CRC Press
Year: 2022

Language: English
Pages: 290
City: Boca Raton

Cover
Half Title
Title Page
Copyright Page
Dedication
Contents
Preface
Contributors
Introduction
CHAPTER 1: On the Proper Role of Linguistically Oriented Deep Net Analysis in Linguistic Theorising
CHAPTER 2: What Artificial Neural Networks Can Tell Us about Human Language Acquisition
CHAPTER 3: Grammar through Spontaneous Order
CHAPTER 4: Language is Acquired in Interaction
CHAPTER 5: Why Algebraic Systems aren’t Sufficient for Syntax
CHAPTER 6: Learning Syntactic Structures from String Input
CHAPTER 7: Analysing Discourse Knowledge in Pre-Trained LMs
CHAPTER 8: Linguistically Guided Multilingual NLP
CHAPTER 9: Word Embeddings are Word Story Embeddings (and That’s Fine)
CHAPTER 10: Algebra and Language: Reasons for (Dis)content
CHAPTER 11: Unitary Recurrent Networks
Index