Artificial Intelligence By Example: Acquire Advanced AI, Machine Learning and Deep Learning design skills


Understand the fundamentals and develop your own AI solutions in this updated edition packed with many new examples.

Learn
- Apply k-nearest neighbors (KNN) to language translation and explore the opportunities in Google Translate
- Understand chained algorithms combining unsupervised learning with decision trees
- Solve the XOR problem with feedforward neural networks (FNNs) and build their architecture to represent a data flow graph
- Learn about meta learning models with hybrid neural networks
- Create a chatbot and address its emotional intelligence deficiencies with tools such as Small Talk and data logging
- Build conversational user interfaces (CUIs) for chatbots
- Write genetic algorithms that optimize deep learning neural networks
- Build quantum computing circuits

About
Artificial intelligence (AI) has the potential to replicate humans in every field. Artificial Intelligence By Example, Second Edition serves as a starting point for you to understand how AI is built, with the help of intriguing and exciting examples. This book will make you an adaptive thinker and help you apply concepts to real-world scenarios. Using some of the most interesting AI examples, from computer programs such as a simple chess engine to cognitive chatbots, you will learn how to tackle the machine you are competing with. You will study some of the most advanced machine learning models, understand how to apply AI to blockchain and the Internet of Things (IoT), and develop emotional quotient in chatbots using neural networks such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs). This edition also has new examples for hybrid neural networks combining reinforcement learning (RL) and deep learning (DL); chained algorithms combining unsupervised learning with decision trees and random forests; combining DL and genetic algorithms; conversational user interfaces (CUI) for chatbots; neuromorphic computing; and quantum computing.
By the end of this book, you will understand the fundamentals of AI and have worked through a number of examples that will help you develop your own AI solutions.

Features
- AI-based examples to guide you in designing and implementing machine intelligence
- Build machine intelligence from scratch using real artificial intelligence examples

Author(s): Denis Rothman
Edition: 2
Publisher: Packt Publishing
Year: 2020

Language: English
Pages: xxii+550
City: S.l.

Cover
Copyright
Packt Page
Contributors
Table of Contents
Preface
Chapter 1: Getting Started with Next-Generation Artificial Intelligence through Reinforcement Learning
Reinforcement learning concepts
How to adapt to machine thinking and become an adaptive thinker
Overcoming real-life issues using the three-step approach
Step 1 – describing a problem to solve: MDP in natural language
Watching the MDP agent at work
Step 2 – building a mathematical model: the mathematical representation of the Bellman equation and MDP
From MDP to the Bellman equation
Step 3 – writing source code: implementing the solution in Python
The lessons of reinforcement learning
How to use the outputs
Possible use cases
Machine learning versus traditional applications
Summary
Questions
Further reading
Chapter 2: Building a Reward Matrix – Designing Your Datasets
Designing datasets – where the dream stops and the hard work begins
Designing datasets
Using the McCulloch-Pitts neuron
The McCulloch-Pitts neuron
The Python-TensorFlow architecture
Logistic activation functions and classifiers
Overall architecture
Logistic classifier
Logistic function
Softmax
Summary
Questions
Further reading
Chapter 3: Machine Intelligence – Evaluation Functions and Numerical Convergence
Tracking down what to measure and deciding how to measure it
Convergence
Implicit convergence
Numerically controlled gradient descent convergence
Evaluating beyond human analytic capacity
Using supervised learning to evaluate a result that surpasses human analytic capacity
Summary
Questions
Further reading
Chapter 4: Optimizing Your Solutions with K-Means Clustering
Dataset optimization and control
Designing a dataset and choosing an ML/DL model
Approval of the design matrix
Implementing a k-means clustering solution
The vision
The data
The strategy
The k-means clustering program
The mathematical definition of k-means clustering
The Python program
Saving and loading the model
Analyzing the results
Bot virtual clusters as a solution
The limits of the implementation of the k-means clustering algorithm
Summary
Questions
Further reading
Chapter 5: How to Use Decision Trees to Enhance K-Means Clustering
Unsupervised learning with KMC with large datasets
Identifying the difficulty of the problem
NP-hard – the meaning of P
NP-hard – the meaning of non-deterministic
Implementing random sampling with mini-batches
Using the LLN
The CLT
Using a Monte Carlo estimator
Trying to train the full training dataset
Training a random sample of the training dataset
Shuffling as another way to perform random sampling
Chaining supervised learning to verify unsupervised learning
Preprocessing raw data
A pipeline of scripts and ML algorithms
Step 1 – training and exporting data from an unsupervised ML algorithm
Step 2 – training a decision tree
Step 3 – a continuous cycle of KMC chained to a decision tree
Random forests as an alternative to decision trees
Summary
Questions
Further reading
Chapter 6: Innovating AI with Google Translate
Understanding innovation and disruption in AI
Is AI disruptive?
AI is based on mathematical theories that are not new
Neural networks are not new
Looking at disruption – the factors that are making AI disruptive
Cloud server power, data volumes, and web sharing of the early 21st century
Public awareness
Inventions versus innovations
Revolutionary versus disruptive solutions
Where to start?
Discover a world of opportunities with Google Translate
Getting started
The program
The header
Implementing Google's translation service
Google Translate from a linguist's perspective
Playing with the tool
Linguistic assessment of Google Translate
AI as a new frontier
Lexical field and polysemy
Exploring the frontier – customizing Google Translate with a Python program
k-nearest neighbor algorithm
Implementing the KNN algorithm
The knn_polysemy.py program
Implementing the KNN function in Google_Translate_Customized.py
Conclusions on the Google Translate customized experiment
The disruptive revolutionary loop
Summary
Questions
Further reading
Chapter 7: Optimizing Blockchains with Naive Bayes
Part I – the background to blockchain technology
Mining bitcoins
Using cryptocurrency
Part II – using blockchains to share information in a supply chain
Using blockchains in the supply chain network
Creating a block
Exploring the blocks
Part III – optimizing a supply chain with naive Bayes in a blockchain process
A naive Bayes example
The blockchain anticipation novelty
The goal – optimizing storage levels using blockchain data
Implementation of naive Bayes in Python
Gaussian naive Bayes
Summary
Questions
Further reading
Chapter 8: Solving the XOR Problem with a Feedforward Neural Network
The original perceptron could not solve the XOR function
XOR and linearly separable models
Linearly separable models
The XOR limit of a linear model, such as the original perceptron
Building an FNN from scratch
Step 1 – defining an FNN
Step 2 – an example of how two children can solve the XOR problem every day
Implementing a vintage XOR solution in Python with an FNN and backpropagation
A simplified version of a cost function and gradient descent
Linear separability was achieved
Applying the FNN XOR function to optimizing subsets of data
Summary
Questions
Further reading
Chapter 9: Abstract Image Classification with Convolutional Neural Networks (CNNs)
Introducing CNNs
Defining a CNN
Initializing the CNN
Adding a 2D convolution layer
Kernel
Shape
ReLU
Pooling
Next convolution and pooling layer
Flattening
Dense layers
Dense activation functions
Training a CNN model
The goal
Compiling the model
The loss function
The Adam optimizer
Metrics
The training dataset
Data augmentation
Loading the data
The testing dataset
Data augmentation on the testing dataset
Loading the data
Training with the classifier
Saving the model
Next steps
Summary
Questions
Further reading and references
Chapter 10: Conceptual Representation Learning
Generating profit with transfer learning
The motivation behind transfer learning
Inductive thinking
Inductive abstraction
The problem AI needs to solve
The gap concept
Loading the trained TensorFlow 2.x model
Loading and displaying the model
Loading the model to use it
Defining a strategy
Making the model profitable by using it for another problem
Domain learning
How to use the programs
The trained models used in this section
The trained model program
Gap – loaded or underloaded
Gap – jammed or open lanes
Gap datasets and subsets
Generalizing the gap conceptual dataset
The motivation of conceptual representation learning metamodels applied to dimensionality
The curse of dimensionality
The blessing of dimensionality
Summary
Questions
Further reading
Chapter 11: Combining Reinforcement Learning and Deep Learning
Planning and scheduling today and tomorrow
A real-time manufacturing process
Amazon must expand its services to face competition
A real-time manufacturing revolution
CRLMM applied to an automated apparel manufacturing process
An apparel manufacturing process
Training the CRLMM
Generalizing the unit training dataset
Food conveyor belt processing – positive p and negative n gaps
Running a prediction program
Building the RL-DL-CRLMM
A circular process
Implementing a CNN-CRLMM to detect gaps and optimize
Q-learning – MDP
MDP inputs and outputs
The optimizer
The optimizer as a regulator
Finding the main target for the MDP function
A circular model – a stream-like system that never starts nor ends
Summary
Questions
Further reading
Chapter 12: AI and the Internet of Things (IoT)
The public service project
Setting up the RL-DL-CRLMM model
Applying the model of the CRLMM
The dataset
Using the trained model
Adding an SVM function
Motivation – using an SVM to increase safety levels
Definition of a support vector machine
Python function
Running the CRLMM
Finding a parking space
Deciding how to get to the parking lot
Support vector machine
The itinerary graph
The weight vector
Summary
Questions
Further reading
Chapter 13: Visualizing Networks with TensorFlow 2.x and TensorBoard
Exploring the output of the layers of a CNN in two steps with TensorFlow
Building the layers of a CNN
Processing the visual output of the layers of a CNN
Analyzing the visual output of the layers of a CNN
Analyzing the accuracy of a CNN using TensorBoard
Getting started with Google Colaboratory
Defining and training the model
Introducing some of the measurements
Summary
Questions
Further reading
Chapter 14: Preparing the Input of Chatbots with Restricted Boltzmann Machines (RBMs) and Principal Component Analysis (PCA)
Defining basic terms and goals
Introducing and building an RBM
The architecture of an RBM
An energy-based model
Building the RBM in Python
Creating a class and the structure of the RBM
Creating a training function in the RBM class
Computing the hidden units in the training function
Random sampling of the hidden units for reconstruction and contrastive divergence
Reconstruction
Contrastive divergence
Error and energy function
Running the epochs and analyzing the results
Using the weights of an RBM as feature vectors for PCA
Understanding PCA
Mathematical explanation
Using TensorFlow's Embedding Projector to represent PCA
Analyzing the PCA to obtain input entry points for a chatbot
Summary
Questions
Further reading
Chapter 15: Setting Up a Cognitive NLP UI/CUI Chatbot
Basic concepts
Defining NLU
Why do we call chatbots "agents"?
Creating an agent to understand Dialogflow
Entities
Intents
Context
Adding fulfillment functionality to an agent
Defining fulfillment
Enhancing the cogfilmdr agent with a fulfillment webhook
Getting the bot to work on your website
Machine learning agents
Using machine learning in a chatbot
Speech-to-text
Text-to-speech
Spelling
Why are these machine learning algorithms important?
Summary
Questions
Further reading
Chapter 16: Improve the Emotional Intelligence Deficiencies of Chatbots
From reacting to emotions, to creating emotions
Solving the problems of emotional polysemy
The greetings problem example
The affirmation example
The speech recognition fallacy
The facial analysis fallacy
Small talk
Courtesy
Emotions
Data logging
Creating emotions
RNN research for future automatic dialog generation
RNNs at work
RNN, LSTM, and vanishing gradients
Text generation with an RNN
Vectorizing the text
Building the model
Generating text
Summary
Questions
Further reading
Chapter 17: Genetic Algorithms in Hybrid Neural Networks
Understanding evolutionary algorithms
Heredity in humans
Our cells
How heredity works
Evolutionary algorithms
Going from a biological model to an algorithm
Basic concepts
Building a genetic algorithm in Python
Importing the libraries
Calling the algorithm
The main function
The parent generation process
Generating a parent
Fitness
Display parent
Crossover and mutation
Producing generations of children
Summary code
Unspecified target to optimize the architecture of a neural network with a genetic algorithm
A physical neural network
What is the nature of this mysterious S-FNN?
Calling the algorithm cell
Fitness cell
ga_main() cell
Artificial hybrid neural networks
Building the LSTM
The goal of the model
Summary
Questions
Further reading
Chapter 18: Neuromorphic Computing
Neuromorphic computing
Getting started with Nengo
Installing Nengo and Nengo GUI
Creating a Python program
A Nengo ensemble
Nengo neuron types
Nengo neuron dimensions
A Nengo node
Connecting Nengo objects
Visualizing data
Probes
Applying Nengo's unique approach to critical AI research areas
Summary
Questions
References
Further reading
Chapter 19: Quantum Computing
The rising power of quantum computers
Quantum computer speed
Defining a qubit
Representing a qubit
The position of a qubit
Radians, degrees, and rotations
The Bloch sphere
Composing a quantum score
Quantum gates with Quirk
A quantum computer score with Quirk
A quantum computer score with IBM Q
A thinking quantum computer
Representing our mind's concepts
Expanding MindX's conceptual representations
The MindX experiment
Preparing the data
Transformation functions – the situation function
Transformation functions – the quantum function
Creating and running the score
Using the output
Summary
Questions
Further reading
Appendix: Answers to the Questions
Chapter 1 – Getting Started with Next-Generation Artificial Intelligence through Reinforcement Learning
Chapter 2 – Building a Reward Matrix – Designing Your Datasets
Chapter 3 – Machine Intelligence – Evaluation Functions and Numerical Convergence
Chapter 4 – Optimizing Your Solutions with K-Means Clustering
Chapter 5 – How to Use Decision Trees to Enhance K-Means Clustering
Chapter 6 – Innovating AI with Google Translate
Chapter 7 – Optimizing Blockchains with Naive Bayes
Chapter 8 – Solving the XOR Problem with a Feedforward Neural Network
Chapter 9 – Abstract Image Classification with Convolutional Neural Networks (CNNs)
Chapter 10 – Conceptual Representation Learning
Chapter 11 – Combining Reinforcement Learning and Deep Learning
Chapter 12 – AI and the Internet of Things
Chapter 13 – Visualizing Networks with TensorFlow 2.x and TensorBoard
Chapter 14 – Preparing the Input of Chatbots with Restricted Boltzmann Machines (RBMs) and Principal Component Analysis (PCA)
Chapter 15 – Setting Up a Cognitive NLP UI/CUI Chatbot
Chapter 16 – Improve the Emotional Intelligence Deficiencies of Chatbots
Chapter 17 – Genetic Algorithms in Hybrid Neural Networks
Chapter 18 – Neuromorphic Computing
Chapter 19 – Quantum Computing
Other Books You May Enjoy
Index