Ultimate Neural Network Programming with Python: Create Powerful Modern AI Systems by Harnessing Neural Networks with Python

Master Neural Networks for Building Modern AI Systems.

Book Description

This book is a practical guide to the world of Artificial Intelligence (AI), unraveling the math and principles behind applications like Google Maps and Amazon. It starts with an introduction to Python and AI, demystifies complex AI math, teaches you to implement AI concepts, and explores high-level AI libraries. Throughout the chapters, practice exercises and supplementary material keep readers engaged. The book then gradually moves to neural networks with Python before diving into constructing ANN models and real-world AI applications. It accommodates various learning styles, letting readers focus on hands-on implementation or on mathematical understanding. This book isn't just about using AI tools; it is a compass in the world of AI resources, empowering readers to modify and create tools for complex AI systems. It ensures a journey of exploration, experimentation, and proficiency in AI, equipping readers with the skills needed to excel in the AI industry.

Author(s): Vishal Rajput
Publisher: Orange Education PVT Ltd
Year: 2023

Language: English
Pages: 401

Cover Page
Title Page
Copyright Page
Dedication Page
About the Author
About the Technical Reviewers
Welcome note
Acknowledgements
Preface
Errata
Table of Contents
1. Understanding AI History
Structure
Evolution of AI
The early history of AI
The most crucial development in the History of AI
AI started evolving into new fields
AI starts taking its modern form
Understanding Intelligent Behavior
AI beats humans at chess
AI learning reasoning and language
AI starts playing poker
Conquering GO and Dota 2
An experience with ChatGPT
Difference between Artificial Intelligence, Machine Learning, and Deep Learning
Formally defining AI terms
Learning representations from data
Sub-Fields of AI
Artificial Intelligence (AI)
Machine Learning (ML)
Deep Learning (DL)
Early Models of Neuron-Inspired Networks
Understanding biological neurons
McCulloch-Pitts model of a neuron
Multilayer Perceptron (MLP)
Conclusion
2. Setting up Python Workflow for AI Development
Structure
Setting up Python Environment
Installing Python
Getting Anaconda for Data Science Environment Setup
Setting up a Virtual Environment
Installing packages
Setting up VS Code
Installing Git
Setting up GitHub with VS Code
Concepts of OOPS
Encapsulation
Accessing Variables
Inheritance
Conclusion
3. Python Libraries for Data Scientists
Structure
Web Scraping
Regex
Multi-Threading and Multi-Processing
Multi-Threading
Multi-Processing
Pandas Basics
Conclusion
4. Foundational Concepts for Effective Neural Network Training
Structure
Activation Functions
RBF, Universal Approximators, and Curse of Dimensionality
Radial Basis Function
Neural Networks are universal approximators
The curse of dimensionality
Overfitting, Bias-Variance, and Generalization
Overfitting problem
Regularization and effective parameters
Dropout
Early stopping and validation set
Bias-Variance trade-off
Generalization
Conclusion
5. Dimensionality Reduction, Unsupervised Learning and Optimizations
Structure
Dimensionality reduction
Principal component analysis (PCA)
t-SNE
Non-linear PCA
Unsupervised learning
Clustering
Semi-supervised learning
Generalizing active learning to multi-class
Self-supervised learning
Version space
Understanding optimization through SVM
Conclusion
6. Building Deep Neural Networks from Scratch
Structure
Coding neurons
A single neuron
Layer of neurons
Understanding lists, arrays, tensors, and their operations
Dot product and vector addition
Cross-product, transpose, and order
Understanding neural networks through NumPy
Neural networks using NumPy
Processing batch of data
Creating a multi-layer network
Dense layers
Activation functions
Calculating loss through categorical cross-entropy loss
Calculating accuracy
Conclusion
7. Derivatives, Backpropagation, and Optimizers
Structure
Weights Optimization
Derivatives
Partial Derivatives
Backpropagation
Optimizers: SGD, Adam, and so on
Gradient-based optimization
Momentum-based optimization
RMSProp
Adam
Conclusion
8. Understanding Convolution and CNN Architectures
Structure
Intricacies of CNN
Local Patterns and Global Patterns
Spatial Hierarchies and Abstraction
Convolution Operation and Feature Maps
Pooling
Padding
Stride
Introduction to CNN-based Networks
Understanding the Complete Flow of CNN-based Network
VGG16
Inception Module: Naïve and Improved Version
ResNet
Other Variants of ResNet
FractalNet and DenseNet
Scaling Conv Networks: EfficientNet Architecture
Different Types of Convolutions
Depth-Separable Convolution
Conclusion
9. Understanding Basics of TensorFlow and Keras
Structure
A Brief Look at Keras
Understanding TensorFlow Internals
Tensors
Computational Graphs
Operations (Ops)
Automatic Differentiation
Sessions
Variables
Eager Execution
Layers and Models (Keras)
TensorFlow vs. PyTorch vs. Theano
TensorFlow vs. PyTorch
TensorFlow vs. Theano
TensorFlow: Layers, Activations, and More
Types of Layers
Dense Layer (Fully Connected Layer)
Convolution Layer
Max Pooling Layer
Dropout Layer
Recurrent Layer (LSTM)
Embedding Layer
Flatten Layer
Batch Normalization Layer
Global Average Pooling Layer
Upsampling/Transposed Convolution Layer
Activation Functions
Optimizers
Weight Initialization
Loss Functions
Multi-Input Single-Output Network with Custom Callbacks
Conclusion
10. Building End-to-end Image Segmentation Pipeline
Structure
Fine-tuning and Interpretability
Power of Fine-Tuning in Deep Learning
SHAP - An Intuitive Way to Interpret Machine Learning Models
Structuring Deep Learning Code
Project Structure
Python modules and packages
Documentation
Unit testing
Debugging
Logging
Building End-to-end Segmentation Pipeline
UNet and Attention Gates
Config
Dataloader
Model building
Understanding Attention block
Executor
Utils
Evaluation
main
Conclusion
11. Latest Advancements in AI
Structure
Transformers: Improving NLP Using Attention
Recurrent Neural Network (RNN)
Long Short-Term Memory (LSTM)
Self-Attention
Example to understand the concept:
Understanding Key, Query, and Value
Example to understand the concept:
Transformer Architecture
ChatGPT/GPT Overview
Object Detection: Understanding YOLO
Object Detector Architecture Breakdown
Backbone, Neck, and Head
Bag of Freebies (BoF)
CmBN: Cross-mini-Batch Normalization
Bag of Specials (BoS)
Cross-Stage Partial (CSP) Connection
YOLO Architecture Selection
Spatial Pyramid Pooling (SPP)
PAN Path-Aggregation Block
Spatial Attention Module (SAM)
Image Generation: GANs and Diffusion Models
Generative Adversarial Networks
Generative vs. Discriminative Models
Variational Autoencoders
GANs
Diffusion Models
DALL-E 2 Architecture
The Encoder: Prior Diffusion Model
The Decoder: GLIDE
Conclusion
Index