Natural Language Processing in Action is your guide to creating machines that understand human language using the power of Python with its ecosystem of packages dedicated to NLP and AI.
About the Technology
Recent advances in deep learning empower applications to understand text and speech with extreme accuracy. The result? Chatbots that can imitate real people, meaningful resume-to-job matches, superb predictive search, and automatically generated document summaries—all at a low cost. New techniques, along with accessible tools like Keras and TensorFlow, make professional-quality NLP easier than ever before.
About the Book
Natural Language Processing in Action is your guide to building machines that can read and interpret human language. In it, you'll use readily available Python packages to capture the meaning in text and react accordingly. The book expands traditional NLP approaches to include neural networks, modern deep learning algorithms, and generative techniques as you tackle real-world problems like extracting dates and names, composing text, and answering free-form questions.
What's inside
• Some sentences in this book were written by NLP! Can you guess which ones?
• Working with Keras, TensorFlow, gensim, and scikit-learn
• Rule-based and data-based NLP
• Scalable pipelines
About the Reader
This book requires a basic understanding of deep learning and intermediate Python skills.
About the Author
Hobson Lane, Cole Howard, and Hannes Max Hapke are experienced NLP engineers who use these techniques in production.
Edition: 1
Publisher: Manning Publications
Year: 2019
Language: English
Pages: 544
City: Shelter Island, NY
PART 1. WORDY MACHINES
1. Packets of thought (NLP overview)
2. Build your vocabulary (word tokenization)
3. Math with words (TF-IDF vectors)
4. Finding meaning in word counts (semantic analysis)
PART 2. DEEPER LEARNING (NEURAL NETWORKS)
5. Baby steps with neural networks (perceptrons and backpropagation)
6. Reasoning with word vectors (Word2vec)
7. Getting words in order with convolutional neural networks (CNNs)
8. Loopy (recurrent) neural networks (RNNs)
9. Improving retention with long short-term memory networks
10. Sequence-to-sequence models and attention
PART 3. GETTING REAL (REAL-WORLD NLP CHALLENGES)
11. Information extraction (named entity extraction and question answering)
12. Getting chatty (dialog engines)
13. Scaling up (optimization, parallelization, and batch processing)