Deep Generative Modeling

This textbook tackles the problem of formulating AI systems by combining probabilistic modeling and deep learning. It goes beyond typical predictive modeling and brings together supervised and unsupervised learning. The resulting paradigm, called deep generative modeling, takes a generative perspective on perceiving the surrounding world: it assumes that each phenomenon is driven by an underlying generative process that defines a joint distribution over random variables and their stochastic interactions, i.e., how events occur and in what order. The adjective "deep" comes from the fact that the distribution is parameterized using deep neural networks. Deep generative modeling has two distinct traits. First, deep neural networks allow rich and flexible parameterization of distributions. Second, modeling stochastic dependencies in the principled manner of probability theory ensures rigorous formulation and prevents potential flaws in reasoning. Moreover, probability theory provides a unified framework in which the likelihood function plays a crucial role in quantifying uncertainty and defining objective functions.

Deep Generative Modeling is designed for curious students, engineers, and researchers with a modest mathematical background in undergraduate calculus, linear algebra, and probability theory, together with the basics of machine learning, deep learning, and programming in Python and PyTorch (or other deep learning libraries). It will appeal to readers from a variety of backgrounds, including computer science, engineering, data science, physics, and bioinformatics, who wish to become familiar with deep generative modeling.

To engage the reader, the book introduces fundamental concepts with specific examples and code snippets; the full code accompanying the book is available on GitHub. The ultimate aim of the book is to outline the most important techniques in deep generative modeling and, eventually, enable readers to formulate new models and implement them.
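
To make the paradigm in the blurb concrete, here is a minimal PyTorch sketch (not taken from the book or its repository) of the recipe it describes: a joint distribution over random variables, factorized autoregressively as p(x) = ∏_d p(x_d | x_<d), parameterized by small neural networks and trained by maximizing the log-likelihood. All names (TinyAutoregressiveModel, the toy dimensionality D, the random placeholder data) are illustrative assumptions, not the book's code.

```python
import torch
import torch.nn as nn

D = 8  # toy number of binary random variables (assumed for illustration)

class TinyAutoregressiveModel(nn.Module):
    """Joint distribution p(x) = prod_d p(x_d | x_<d) over binary x."""
    def __init__(self, dim):
        super().__init__()
        self.dim = dim
        # One small network per conditional p(x_d | x_<d); a deliberately
        # naive parameterization that keeps the sketch self-contained.
        self.nets = nn.ModuleList(
            [nn.Sequential(nn.Linear(max(d, 1), 16), nn.ReLU(), nn.Linear(16, 1))
             for d in range(dim)]
        )

    def log_prob(self, x):
        # x: (batch, dim) tensor with values in {0, 1}
        logps = []
        for d in range(self.dim):
            # p(x_0) gets a constant dummy input; later conditionals see x_<d.
            context = x[:, :d] if d > 0 else torch.zeros(x.size(0), 1)
            logits = self.nets[d](context).squeeze(-1)
            # BCE-with-logits is exactly -log p(x_d | x_<d) for a Bernoulli.
            logps.append(-nn.functional.binary_cross_entropy_with_logits(
                logits, x[:, d], reduction="none"))
        return torch.stack(logps, dim=-1).sum(-1)  # log p(x) per example

model = TinyAutoregressiveModel(D)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
data = torch.randint(0, 2, (256, D)).float()  # placeholder data

for _ in range(100):
    loss = -model.log_prob(data).mean()  # negative log-likelihood objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same likelihood-based training loop carries over, with different parameterizations of the distribution, to the flow-based, latent variable, and energy-based models covered in the chapters below.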

Author(s): Jakub M. Tomczak
Publisher: Springer Nature
Year: 2022

Language: English
Pages: 210

Foreword
Preface
Acknowledgments
Contents
1 Why Deep Generative Modeling?
1.1 AI Is Not Only About Decision Making
1.2 Where Can We Use (Deep) Generative Modeling?
1.3 How to Formulate (Deep) Generative Modeling?
1.3.1 Autoregressive Models
1.3.2 Flow-Based Models
1.3.3 Latent Variable Models
1.3.4 Energy-Based Models
1.3.5 Overview
1.4 Purpose and Content of This Book
References
2 Autoregressive Models
2.1 Introduction
2.2 Autoregressive Models Parameterized by Neural Networks
2.2.1 Finite Memory
2.2.2 Long-Range Memory Through RNNs
2.2.3 Long-Range Memory Through Convolutional Nets
2.3 Deep Generative Autoregressive Model in Action!
2.3.1 Code
2.4 Is It All? No!
References
3 Flow-Based Models
3.1 Flows for Continuous Random Variables
3.1.1 Introduction
3.1.2 Change of Variables for Deep Generative Modeling
3.1.3 Building Blocks of RealNVP
3.1.3.1 Coupling Layers
3.1.3.2 Permutation Layers
3.1.3.3 Dequantization
3.1.4 Flows in Action!
3.1.5 Code
3.1.6 Is It All? Really?
3.1.7 ResNet Flows and DenseNet Flows
3.2 Flows for Discrete Random Variables
3.2.1 Introduction
3.2.2 Flows in ℝ or Maybe Rather in ℤ?
3.2.3 Integer Discrete Flows
3.2.4 Code
3.2.5 What's Next?
References
4 Latent Variable Models
4.1 Introduction
4.2 Probabilistic Principal Component Analysis
4.3 Variational Auto-Encoders: Variational Inference for Non-linear Latent Variable Models
4.3.1 The Model and the Objective
4.3.2 A Different Perspective on the ELBO
4.3.3 Components of VAEs
4.3.3.1 Parameterization of Distributions
4.3.3.2 Reparameterization Trick
4.3.4 VAE in Action!
4.3.5 Code
4.3.6 Typical Issues with VAEs
4.3.7 There Is More!
4.4 Improving Variational Auto-Encoders
4.4.1 Priors
4.4.1.1 Standard Gaussian
4.4.1.2 Mixture of Gaussians
4.4.1.3 VampPrior: Variational Mixture of Posterior Prior
4.4.1.4 GTM: Generative Topographic Mapping
4.4.1.5 GTM-VampPrior
4.4.1.6 Flow-Based Prior
4.4.1.7 Remarks
4.4.2 Variational Posteriors
4.4.2.1 Variational Posteriors with Householder Flows
4.4.2.2 Variational Posteriors with Sylvester Flows
4.4.2.3 Hyperspherical Latent Space
4.5 Hierarchical Latent Variable Models
4.5.1 Introduction
4.5.2 Hierarchical VAEs
4.5.2.1 Two-Level VAEs
4.5.2.2 Top-Down VAEs
4.5.2.3 Code
4.5.2.4 Further Reading
4.5.3 Diffusion-Based Deep Generative Models
4.5.3.1 Introduction
4.5.3.2 Model Formulation
4.5.3.3 Code
4.5.3.4 Discussion
References
5 Hybrid Modeling
5.1 Introduction
5.1.1 Approach 1: Let's Be Naive!
5.1.2 Approach 2: Shared Parameterization!
5.2 Hybrid Modeling
5.3 Let's Implement It!
5.4 Code
5.5 What's Next?
References
6 Energy-Based Models
6.1 Introduction
6.2 Model Formulation
6.3 Training
6.4 Code
6.5 Restricted Boltzmann Machines
6.6 Final Remarks
References
7 Generative Adversarial Networks
7.1 Introduction
7.2 Implicit Modeling with Generative Adversarial Networks (GANs)
7.3 Implementing GANs
7.4 There Are Many GANs Out There!
References
8 Deep Generative Modeling for Neural Compression
8.1 Introduction
8.2 General Compression Scheme
8.3 A Short Detour: JPEG
8.4 Neural Compression: Components
8.5 What's Next?
References
A Useful Facts from Algebra and Calculus
A.1 Norms & Inner Products
A.2 Matrix Calculus
B Useful Facts from Probability Theory and Statistics
B.1 Commonly Used Probability Distributions
B.2 Statistics
Index