Graph Representation Learning

Graph-structured data is ubiquitous throughout the natural and social sciences, from telecommunication networks to quantum chemistry. Building relational inductive biases into deep learning architectures is crucial for creating systems that can learn, reason, and generalize from this kind of data. Recent years have seen a surge in research on graph representation learning, including techniques for deep graph embeddings, generalizations of convolutional neural networks to graph-structured data, and neural message-passing approaches inspired by belief propagation. These advances in graph representation learning have led to new state-of-the-art results in numerous domains, including chemical synthesis, 3D vision, recommender systems, question answering, and social network analysis. This book provides a synthesis and overview of graph representation learning. It begins with a discussion of the goals of graph representation learning as well as key methodological foundations in graph theory and network analysis. Following this, the book introduces and reviews methods for learning node embeddings, including random-walk-based methods and applications to knowledge graphs. It then provides a technical synthesis and introduction to the highly successful graph neural network (GNN) formalism, which has become a dominant and fast-growing paradigm for deep learning with graph data. The book concludes with a synthesis of recent advancements in deep generative models for graphs—a nascent but quickly growing subset of graph representation learning.
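
For readers new to the graph neural network formalism mentioned above, a minimal sketch of one round of neural message passing may help fix intuitions: each node aggregates the embeddings of its neighbors and combines them with its own embedding through learned weight matrices. The sketch below is illustrative only and is not taken from the book; the function and parameter names (basic_gnn_layer, W_self, W_neigh) are our own, and it assumes a dense NumPy adjacency matrix, sum aggregation, and a ReLU nonlinearity.

```python
import numpy as np

def basic_gnn_layer(A, H, W_self, W_neigh, b):
    """One round of a basic GNN message-passing update:
    h_v = relu(W_self h_v + W_neigh * sum_{u in N(v)} h_u + b).

    A: (n, n) binary adjacency matrix
    H: (n, d_in) node embeddings from the previous layer
    W_self, W_neigh: (d_in, d_out) weight matrices
    b: (d_out,) bias vector
    """
    # Sum the embeddings of each node's neighbors (A @ H).
    aggregated = A @ H
    # Combine self and neighborhood information, then apply ReLU.
    return np.maximum(0.0, H @ W_self + aggregated @ W_neigh + b)

# Toy usage: a 4-node path graph with random 8-dimensional features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 8))
W_self = rng.normal(size=(8, 16))
W_neigh = rng.normal(size=(8, 16))
b = np.zeros(16)
H_next = basic_gnn_layer(A, H, W_self, W_neigh, b)  # shape (4, 16)
```

Stacking several such layers lets information propagate over multi-hop neighborhoods; the book's later chapters generalize this template with normalization, attention, gated updates, and edge features.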

Author(s): William L. Hamilton
Series: Synthesis Lectures on Artificial Intelligence and Machine Learning
Edition: 1
Publisher: Morgan & Claypool
Year: 2020

Language: English
Commentary: Vector PDF
Pages: 160
City: San Rafael, CA
Tags: Neural Networks; Deep Learning; Social Networks; Network Analysis; Knowledge Graph; Graph Neural Networks

Preface
Acknowledgments
Introduction
What is a Graph?
Multi-Relational Graphs
Feature Information
Machine Learning on Graphs
Node Classification
Relation Prediction
Clustering and Community Detection
Graph Classification, Regression, and Clustering
Background and Traditional Approaches
Graph Statistics and Kernel Methods
Node-Level Statistics and Features
Graph-Level Features and Graph Kernels
Neighborhood Overlap Detection
Local Overlap Measures
Global Overlap Measures
Graph Laplacians and Spectral Methods
Graph Laplacians
Graph Cuts and Clustering
Generalized Spectral Clustering
Toward Learned Representations
Node Embeddings
Neighborhood Reconstruction Methods
An Encoder-Decoder Perspective
The Encoder
The Decoder
Optimizing an Encoder-Decoder Model
Overview of the Encoder-Decoder Approach
Factorization-Based Approaches
Random Walk Embeddings
Random Walk Methods and Matrix Factorization
Limitations of Shallow Embeddings
Multi-Relational Data and Knowledge Graphs
Reconstructing Multi-Relational Data
Loss Functions
Multi-Relational Decoders
Representational Abilities
Graph Neural Networks
The Graph Neural Network Model
Neural Message Passing
Overview of the Message Passing Framework
Motivations and Intuitions
The Basic GNN
Message Passing with Self-Loops
Generalized Neighborhood Aggregation
Neighborhood Normalization
Set Aggregators
Neighborhood Attention
Generalized Update Methods
Concatenation and Skip-Connections
Gated Updates
Jumping Knowledge Connections
Edge Features and Multi-Relational GNNs
Relational Graph Neural Networks
Attention and Feature Concatenation
Graph Pooling
Generalized Message Passing
Graph Neural Networks in Practice
Applications and Loss Functions
GNNs for Node Classification
GNNs for Graph Classification
GNNs for Relation Prediction
Pre-Training GNNs
Efficiency Concerns and Node Sampling
Graph-Level Implementations
Subsampling and Mini-Batching
Parameter Sharing and Regularization
Theoretical Motivations
GNNs and Graph Convolutions
Convolutions and the Fourier Transform
From Time Signals to Graph Signals
Spectral Graph Convolutions
Convolution-Inspired GNNs
GNNs and Probabilistic Graphical Models
Hilbert Space Embeddings of Distributions
Graphs as Graphical Models
Embedding Mean-Field Inference
GNNs and PGMs More Generally
GNNs and Graph Isomorphism
Graph Isomorphism
Graph Isomorphism and Representational Capacity
The Weisfeiler–Lehman Algorithm
GNNs and the WL Algorithm
Beyond the WL Algorithm
Generative Graph Models
Traditional Graph Generation Approaches
Overview of Traditional Approaches
Erdős–Rényi Model
Stochastic Block Models
Preferential Attachment
Traditional Applications
Deep Generative Models
Variational Autoencoder Approaches
Node-Level Latents
Graph-Level Latents
Adversarial Approaches
Autoregressive Methods
Modeling Edge Dependencies
Recurrent Models for Graph Generation
Evaluating Graph Generation
Molecule Generation
Conclusion
Bibliography
Author's Biography