Bayesian Tensor Decomposition for Signal Processing and Machine Learning: Modeling, Tuning-Free Algorithms, and Applications

This book presents recent advances in Bayesian inference for structured tensor decompositions. It explains how Bayesian modeling and inference lead to tuning-free tensor decomposition algorithms that achieve state-of-the-art performance in many applications, including:
  • blind source separation;
  • social network mining;
  • image and video processing;
  • array signal processing; and
  • wireless communications.

The book begins with an introduction to the fundamentals of tensors and Bayesian inference. It then develops probabilistic models for various structured tensor decompositions together with their inference algorithms, with applications tailored to each decomposition presented in the corresponding chapters. The book concludes by looking to the future and identifying areas where this research can be developed further.
Bayesian Tensor Decomposition for Signal Processing and Machine Learning is suitable for postgraduates and researchers with an interest in tensor data analytics and Bayesian methods.
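
For orientation, the sketch below (not taken from the book) shows the canonical polyadic decomposition (CPD) model that the book builds on, fitted with a plain alternating least squares (ALS) loop in NumPy; all function names here are illustrative. Note that the rank R must be fixed by hand, which is exactly the tuning burden the book's Bayesian algorithms remove via automatic rank learning.

```python
# Minimal rank-R CPD of a 3-way tensor via alternating least squares (ALS).
# Illustrative sketch only: unlike the book's Bayesian algorithms, the
# rank R is a manually chosen tuning parameter here.
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: bring `mode` to the front, flatten the rest (C order).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J) x R.
    I, R = A.shape
    J = B.shape[0]
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cpd_als(T, R, n_iter=200, seed=0):
    # Fit T ~ sum_r a_r (outer) b_r (outer) c_r by cycling exact
    # least-squares updates of the three factor matrices.
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, R)) for s in T.shape)
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

# Sanity check: recover an exact rank-2 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 2)) for s in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cpd_als(T, R=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))  # relative error, ~0
```

In the Bayesian treatment developed in the book, the factor columns instead receive sparsity-promoting priors (e.g., the GSM family of Chapter 3), so superfluous rank components are pruned automatically during inference rather than fixed in advance.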

Authors: Lei Cheng, Zhongtao Chen, Yik-Chung Wu
Publisher: Springer
Year: 2023

Language: English
Pages: 188
City: Cham

Preface
Contents
1 Tensor Decomposition: Basics, Algorithms, and Recent Advances
1.1 Terminologies and Notations
1.1.1 Scalar, Vector, Matrix, and Tensor
1.1.2 Tensor Unfolding/Matricization
1.1.3 Tensor Products and Norms
1.2 Representation Learning via Tensors
1.2.1 Canonical Polyadic Decomposition (CPD)
1.2.2 Tucker Decomposition (TuckerD)
1.2.3 Tensor Train Decomposition (TTD)
1.3 Model Fitting and Challenges Ahead
1.3.1 Example: Tensor CPD
1.3.2 Challenges in Rank Determination
References
2 Bayesian Learning for Sparsity-Aware Modeling
2.1 Bayes' Theorem
2.2 Bayesian Learning and Sparsity-Aware Learning
2.3 Prior Design for Sparsity-Aware Modeling
2.4 Inference Algorithm Development
2.5 Mean-Field Variational Inference
2.5.1 General Solution
2.5.2 Tractability of MF-VI
2.5.3 Definition of MPCEF Model
2.5.4 Optimal Variational Pdfs for MPCEF Model
References
3 Bayesian Tensor CPD: Modeling and Inference
3.1 A Unified Probabilistic Modeling Using GSM Prior
3.2 PCPD-GG: Probabilistic Modeling
3.3 PCPD-GH: Probabilistic Modeling
3.4 PCPD-GH, PCPD-GG: Inference Algorithm
3.4.1 Optimal Variational Pdfs
3.4.2 Setting the Hyper-parameters
3.5 Algorithm Summary and Insights
3.5.1 Convergence Property
3.5.2 Automatic Tensor Rank Learning
3.5.3 Computational Complexity
3.5.4 Reducing to PCPD-GG
3.6 Non-parametric Modeling: PCPD-MGP
3.7 PCPD-MGP: Inference Algorithm
References
4 Bayesian Tensor CPD: Performance and Real-World Applications
4.1 Numerical Results on Synthetic Data
4.1.1 Simulation Setup
4.1.2 PCPD-GH Versus PCPD-GG
4.1.3 Comparisons with Non-parametric PCPD-MGP
4.2 Real-World Applications
4.2.1 Fluorescence Data Analytics
4.2.2 Hyperspectral Image Denoising
References
5 When Stochastic Optimization Meets VI: Scaling Bayesian CPD to Massive Data
5.1 CPD Problem Reformulation
5.1.1 Probabilistic Model and Inference for the Reformulated Problem
5.2 Interpreting VI Update from Natural Gradient Descent Perspective
5.2.1 Optimal Variational Pdfs in Exponential Family Form
5.2.2 VI Updates as Natural Gradient Descent
5.3 Scalable VI Algorithm for Tensor CPD
5.3.1 Summary of Iterative Algorithm
5.3.2 Further Discussions
5.4 Numerical Examples
5.4.1 Convergence Performance on Synthetic Data
5.4.2 Tensor Rank Estimation on Synthetic Data
5.4.3 Video Background Modeling
5.4.4 Image Feature Extraction
References
6 Bayesian Tensor CPD with Nonnegative Factors
6.1 Tensor CPD with Nonnegative Factors
6.1.1 Motivating Example—Social Group Clustering
6.1.2 General Problem and Challenges
6.2 Probabilistic Modeling for CPD with Nonnegative Factors
6.2.1 Properties of Nonnegative Gaussian-Gamma Prior
6.2.2 Probabilistic Modeling of CPD with Nonnegative Factors
6.3 Inference Algorithm for Tensor CPD with Nonnegative Factors
6.3.1 Derivation for Variational Pdfs
6.3.2 Summary of the Inference Algorithm
6.3.3 Discussions and Insights
6.4 Algorithm Accelerations
6.5 Numerical Results
6.5.1 Validation on Synthetic Data
6.5.2 Fluorescence Data Analysis
6.5.3 ENRON E-mail Data Mining
References
7 Complex-Valued CPD, Orthogonality Constraint, and Beyond Gaussian Noises
7.1 Problem Formulation
7.2 Probabilistic Modeling
7.3 Inference Algorithm Development
7.3.1 Derivation for Q(Ξ(k)), 1 ≤ k ≤ P
7.3.2 Derivation for Q(Ξ(k)), P + 1 ≤ k ≤ N
7.3.3 Derivation for Q(ℰ)
7.3.4 Derivations for Q(γl), Q(ζi1,…,iN), and Q(β)
7.3.5 Summary of the Iterative Algorithm
7.3.6 Further Discussions
7.4 Simulation Results and Discussions
7.4.1 Validation on Synthetic Data
7.4.2 Blind Data Detection for DS-CDMA Systems
7.4.3 Linear Image Coding for a Collection of Images
References
8 Handling Missing Value: A Case Study in Direction-of-Arrival Estimation
8.1 Linking DOA Subspace Estimation to Tensor Completion
8.2 Probabilistic Modeling
8.3 MPCEF Model Checking and Optimal Variational Pdfs Derivations
8.3.1 MPCEF Model Checking
8.3.2 Optimal Variational Pdfs Derivations
8.4 Algorithm Summary and Remarks
8.5 Simulation Results and Discussions
References
9 From CPD to Other Tensor Decompositions
9.1 Tucker Decomposition (TuckerD)
9.2 Tensor Train Decomposition (TTD)
9.3 PARAFAC2
9.4 Tensor-SVD (T-SVD)
References