Python Deep Learning: Understand how deep neural networks work and apply them to real-world tasks

Master effective navigation of neural networks, including convolutions and transformers, to tackle computer vision and NLP tasks using Python

Key Features

  • Understand the theory, mathematical foundations and the structure of deep neural networks
  • Become familiar with transformers, large language models, and convolutional networks
  • Learn how to apply them to various computer vision and natural language processing problems

Purchase of the print or Kindle book includes a free PDF eBook

Book Description

The field of deep learning has developed rapidly in recent years and today covers a broad range of applications. This makes it challenging to navigate and hard to understand without solid foundations. This book will guide you from the basics of neural networks to the state-of-the-art large language models in use today.

The first part of the book introduces the main machine learning concepts and paradigms. It covers the mathematical foundations, the structure, and the training algorithms of neural networks and dives into the essence of deep learning.
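
To give a flavor of the training algorithms covered in this part, the following is a minimal sketch of gradient descent on a toy linear model in PyTorch; the data, learning rate, and number of steps are illustrative assumptions, not examples taken from the book.

```python
# A minimal sketch of a gradient-descent training loop on a toy linear model.
# The data, learning rate, and step count are illustrative assumptions.
import torch

# Toy data: y = 2x + 1 with a little noise
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

for step in range(200):
    loss = ((x * w + b - y) ** 2).mean()  # mean squared error
    loss.backward()                       # backpropagation computes the gradients
    with torch.no_grad():
        w -= 0.1 * w.grad                 # gradient descent update
        b -= 0.1 * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # should approach 2.0 and 1.0
```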

The second part of the book introduces convolutional networks for computer vision. We’ll learn how to solve image classification, object detection, instance segmentation, and image generation tasks.
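
As a taste of this part, here is a minimal sketch of a small convolutional classifier in PyTorch; the architecture, input size, and hyperparameters are illustrative assumptions rather than the book's own examples.

```python
# A minimal sketch of a convolutional image classifier.
# Architecture and shapes are illustrative assumptions.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1 input channel (grayscale)
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Forward pass on a dummy batch of 28x28 grayscale images
model = SmallCNN()
logits = model(torch.randn(8, 1, 28, 28))
print(logits.shape)  # torch.Size([8, 10])
```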

The third part focuses on the attention mechanism and transformers – the core network architecture of large language models. We’ll discuss the new types of advanced tasks they can solve, such as chatbots and text-to-image generation.
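
To illustrate the kind of model this part works with, here is a minimal sketch of text generation with the Hugging Face Transformers pipeline API; the choice of GPT-2 is an assumption made purely for illustration, not the model used in the book.

```python
# A minimal sketch of text generation with the Hugging Face pipeline API.
# GPT-2 is used here only because it is small; this is an illustrative choice.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Deep learning is", max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])
```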

By the end of this book, you’ll have a thorough understanding of the inner workings of deep neural networks. You'll have the ability to develop new models or adapt existing ones to solve your tasks. You’ll also have sufficient understanding to continue your research and stay up to date with the latest advancements in the field.

What you will learn

  • Establish theoretical foundations of deep neural networks
  • Understand convolutional networks and apply them in computer vision applications
  • Become well versed with natural language processing and recurrent networks
  • Explore the attention mechanism and transformers (see the sketch after this list)
  • Apply transformers and large language models for natural language and computer vision
  • Implement coding examples with PyTorch, Keras, and Hugging Face Transformers
  • Use MLOps to develop and deploy neural network models
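
As a small illustration of the attention mechanism mentioned above, the following sketch implements scaled dot-product attention, the core operation behind transformer attention; the tensor shapes are illustrative assumptions.

```python
# A minimal sketch of scaled dot-product attention.
# Batch size, sequence length, and model dimension are illustrative assumptions.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_model)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (batch, seq_len, seq_len)
    weights = torch.softmax(scores, dim=-1)            # attention weights
    return weights @ v                                 # weighted sum of the values

q = k = v = torch.randn(2, 5, 64)  # self-attention on a dummy sequence
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 5, 64])
```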

Who this book is for

This book is for software developers/engineers, students, data scientists, data analysts, machine learning engineers, statisticians, and anyone interested in deep learning. Prior experience with Python programming is a prerequisite.

Table of Contents

  1. Machine Learning – an Introduction
  2. Neural Networks
  3. Deep Learning Fundamentals
  4. Computer Vision with Convolutional Networks
  5. Advanced Computer Vision Applications
  6. Natural Language Processing and Recurrent Neural Networks
  7. The Attention Mechanism and Transformers
  8. Exploring Large Language Models in Depth
  9. Advanced Applications of Large Language Models
  10. Machine Learning Operations (MLOps)

Author(s): Ivan Vasilev
Edition: 3
Publisher: Packt Publishing
Year: 2023

Language: English
Commentary: Publisher PDF | Published: November 2023
Pages: 362
City: Birmingham
Tags: Deep Learning; Neural Networks; Large Language Models; Convolutional Networks; Transformers; Computer Vision; Natural Language Processing

Cover
Title Page
Copyright and Credits
Contributors
Table of Contents
Preface
Part 1: Introduction to Neural Networks
Chapter 1: Machine Learning – an Introduction
Technical requirements
Introduction to ML
Different ML approaches
Supervised learning
Unsupervised learning
Reinforcement learning
Components of an ML solution
Neural networks
Introducing PyTorch
Summary
Chapter 2: Neural Networks
Technical requirements
The need for NNs
The math of NNs
Linear algebra
An introduction to probability
Differential calculus
An introduction to NNs
Units – the smallest NN building block
Layers as operations
Multi-layer NNs
Activation functions
The universal approximation theorem
Training NNs
Gradient descent (GD)
Backpropagation
A code example of an NN for the XOR function
Summary
Chapter 3: Deep Learning Fundamentals
Technical requirements
Introduction to DL
Fundamental DL concepts
Feature learning
The reasons for DL’s popularity
Deep neural networks
Training deep neural networks
Improved activation functions
DNN regularization
Applications of DL
Introducing popular DL libraries
Classifying digits with Keras
Classifying digits with PyTorch
Summary
Part 2: Deep Neural Networks for Computer Vision
Chapter 4: Computer Vision with Convolutional Networks
Technical requirements
Intuition and justification for CNNs
Convolutional layers
A coding example of the convolution operation
Cross-channel and depthwise convolutions
Stride and padding in convolutional layers
Pooling layers
The structure of a convolutional network
Classifying images with PyTorch and Keras
Convolutional layers in deep learning libraries
Data augmentation
Classifying images with PyTorch
Classifying images with Keras
Advanced types of convolutions
1D, 2D, and 3D convolutions
1×1 convolutions
Depthwise separable convolutions
Dilated convolutions
Transposed convolutions
Advanced CNN models
Introducing residual networks
Inception networks
Introducing Xception
Squeeze-and-Excitation Networks
Introducing MobileNet
EfficientNet
Using pre-trained models with PyTorch and Keras
Summary
Chapter 5: Advanced Computer Vision Applications
Technical requirements
Transfer learning (TL)
Transfer learning with PyTorch
Transfer learning with Keras
Object detection
Approaches to object detection
Object detection with YOLO
Object detection with Faster R-CNN
Introducing image segmentation
Semantic segmentation with U-Net
Instance segmentation with Mask R-CNN
Image generation with diffusion models
Introducing generative models
Denoising Diffusion Probabilistic Models
Summary
Part 3: Natural Language Processing and Transformers
Chapter 6: Natural Language Processing and Recurrent Neural Networks
Technical requirements
Natural language processing
Tokenization
Introducing word embeddings
Word2Vec
Visualizing embedding vectors
Language modeling
Introducing RNNs
RNN implementation and training
Backpropagation through time
Vanishing and exploding gradients
Long short-term memory
Gated recurrent units
Implementing text classification
Summary
Chapter 7: The Attention Mechanism and Transformers
Technical requirements
Introducing seq2seq models
Understanding the attention mechanism
Bahdanau attention
Luong attention
General attention
Transformer attention
Implementing TA
Building transformers with attention
Transformer encoder
Transformer decoder
Putting it all together
Decoder-only and encoder-only models
Bidirectional Encoder Representations from Transformers
Generative Pre-trained Transformer
Summary
Chapter 8: Exploring Large Language Models in Depth
Technical requirements
Introducing LLMs
LLM architecture
LLM attention variants
Prefix decoder
Transformer nuts and bolts
Models
Training LLMs
Training datasets
Pre-training properties
FT with RLHF
Emergent abilities of LLMs
Introducing Hugging Face Transformers
Summary
Chapter 9: Advanced Applications of Large Language Models
Technical requirements
Classifying images with Vision Transformer
Using ViT with Hugging Face Transformers
Understanding the DEtection TRansformer
Using DetR with Hugging Face Transformers
Generating images with Stable Diffusion
Autoencoder
Conditioning transformer
Diffusion model
Using Stable Diffusion with Hugging Face Transformers
Exploring fine-tuning transformers
Harnessing the power of LLMs with LangChain
Using LangChain in practice
Summary
Part 4: Developing and Deploying Deep Neural Networks
Chapter 10: Machine Learning Operations (MLOps)
Technical requirements
Understanding model development
Choosing an NN framework
PyTorch versus TensorFlow versus JAX
Open Neural Network Exchange
Introducing TensorBoard
Developing NN models for edge devices with TF Lite
Mixed-precision training with PyTorch
Exploring model deployment
Deploying NN models with Flask
Building ML web apps with Gradio
Summary
Index
Other Books You May Enjoy