Deep Learning for Natural Language Processing: Creating Neural Networks with Python

Discover the concepts of deep learning used for natural language processing (NLP), with full-fledged examples of neural network models such as recurrent neural networks, long short-term memory (LSTM) networks, and sequence-to-sequence (Seq2seq) models. You'll start by covering the mathematical prerequisites and the fundamentals of deep learning and NLP, with practical examples. The first three chapters cover the basics of NLP, starting with word-vector representation before moving on to more advanced algorithms. The final chapters focus entirely on implementation and deal with sophisticated architectures such as RNNs, LSTMs, and Seq2seq, using the Python tools TensorFlow and Keras.

Deep Learning for Natural Language Processing follows a progressive approach and combines all the knowledge you have gained to build a question-and-answer chatbot system. This book is a good starting point for anyone who wants to get started with deep learning for NLP. All the code presented in the book is available as IPython notebooks and scripts, which allow you to try out the examples and extend them in interesting ways.

What You Will Learn
- Gain the fundamentals of deep learning and its mathematical prerequisites
- Discover deep learning frameworks in Python
- Develop a chatbot
- Implement a research paper on sentiment classification

Who This Book Is For
Software developers who are curious to try out deep learning with NLP.
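To give a flavor of the recurrent models described above, here is a minimal sketch of the vanilla RNN recurrence (h_t = tanh(W x_t + U h_{t-1} + b)) in plain NumPy. The dimensions, weights, and toy sequence are illustrative assumptions, not material from the book, which implements its models with TensorFlow and Keras.

```python
import numpy as np

# Illustrative sizes; the book's examples use real text data instead.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 3

# Randomly initialized parameters of a single recurrent layer.
W = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # input-to-hidden
U = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden
b = np.zeros(hidden_dim)                                  # bias

def rnn_step(x_t, h_prev):
    """One recurrence step: mix the current input with the previous hidden state."""
    return np.tanh(W @ x_t + U @ h_prev + b)

# Unroll the recurrence over a toy sequence of 5 input vectors.
h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):
    h = rnn_step(x_t, h)

print(h.shape)  # final hidden state summarizing the whole sequence
```

The same unrolled loop is what LSTM and Seq2seq architectures refine: LSTMs replace `rnn_step` with a gated cell, and Seq2seq models chain an encoder loop into a decoder loop.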

Author(s): Palash Goyal; Sumit Pandey; Karan Jain
Publisher: Apress
Year: 2018

Language: English
Pages: 277

Table of Contents
About the Authors
About the Technical Reviewer
Acknowledgments
Introduction
Chapter 1: Introduction to Natural Language Processing and Deep Learning
Python Packages
NumPy
Pandas
SciPy
Introduction to Natural Language Processing
What Is Natural Language Processing?
Good Enough, But What Is the Big Deal?
What Makes Natural Language Processing Difficult?
Ambiguity at Word Level
Ambiguity at Sentence Level
Ambiguity at Meaning Level
What Do We Want to Achieve Through Natural Language Processing?
Common Terms Associated with Language Processing
Natural Language Processing Libraries
NLTK
TextBlob
SpaCy
Gensim
Pattern
Stanford CoreNLP
Getting Started with NLP
Text Search Using Regular Expressions
Text to List
Preprocessing the Text
Accessing Text from the Web
Removal of Stopwords
Count Vectorization
TF-IDF Score
Text Classifier
Introduction to Deep Learning
How Deep Is “Deep”?
What Are Neural Networks?
Basic Structure of Neural Networks
Types of Neural Networks
Feedforward Neural Networks
Convolutional Neural Networks
Recurrent Neural Networks
Encoder-Decoder Networks
Recursive Neural Networks
Multilayer Perceptrons
Stochastic Gradient Descent
Backpropagation
Deep Learning Libraries
Theano
Theano Installation
Theano Examples
TensorFlow
Data Flow Graphs
TensorFlow Installation
TensorFlow Examples
Keras
Keras Installation
Keras Principles
Keras Examples
Next Steps
Chapter 2: Word Vector Representations
Introduction to Word Embedding
Neural Language Model
Word2vec
Skip-Gram Model
Model Components: Architecture
Model Components: Hidden Layer
Model Components: Output Layer
CBOW Model
Subsampling Frequent Words
Negative Sampling
Word2vec Code
Skip-Gram Code
CBOW Code
Next Steps
Chapter 3: Unfolding Recurrent Neural Networks
Recurrent Neural Networks
What Is Recurrence?
Differences Between Feedforward and Recurrent Neural Networks
Recurrent Neural Network Basics
Natural Language Processing and Recurrent Neural Networks
RNNs Mechanism
Training RNNs
Meta Meaning of Hidden State of RNN
Tuning RNNs
Long Short-Term Memory Networks
Components of LSTM
How LSTM Helps to Reduce the Vanishing Gradient Problem
Understanding GRUs
Limitations of LSTMs
Sequence-to-Sequence Models
What Is It?
Bidirectional Encoder
Stacked Bidirectional Encoder
Decoder
Advanced Sequence-to-Sequence Models
Attention Scoring
Teacher Forcing
Peeking
Sequence-to-Sequence Use Case
Next Steps
Chapter 4: Developing a Chatbot
Introduction to Chatbot
Origin of Chatbots
But How Does a Chatbot Work, Anyway?
Why Are Chatbots Such a Big Opportunity?
Building a Chatbot Can Sound Intimidating. Is It Actually?
Conversational Bot
Chatbot: Automatic Text Generation
Next Steps
Chapter 5: Research Paper Implementation: Sentiment Classification
Self-Attentive Sentence Embedding
Proposed Approach
Model
Penalization Term
Visualization
General Case
Sentiment Analysis Case
Research Findings
Implementing Sentiment Classification
Sentiment Classification Code
Model Results
TensorBoard
Model Accuracy and Cost
Case 1
Case 2
Scope for Improvement
Next Steps
Index