This book provides a balanced, modern summary of Bayesian and frequentist methods for regression analysis.
Table of Contents
Cover
Bayesian and Frequentist Regression Methods
ISBN 9781441909244
ISBN 9781441909251
Preface
Contents
Chapter 1 Introduction and Motivating Examples
1.1 Introduction
1.2 Model Formulation
1.3 Motivating Examples
1.3.1 Prostate Cancer
1.3.2 Outcome After Head Injury
1.3.3 Lung Cancer and Radon
1.3.4 Pharmacokinetic Data
1.3.5 Dental Growth
1.3.6 Spinal Bone Mineral Density
1.4 Nature of Randomness
1.5 Bayesian and Frequentist Inference
1.6 The Executive Summary
1.7 Bibliographic Notes
Part I
Chapter 2 Frequentist Inference
2.1 Introduction
2.2 Frequentist Criteria
2.3 Estimating Functions
2.4 Likelihood
o 2.4.1 Maximum Likelihood Estimation
o 2.4.2 Variants on Likelihood
o 2.4.3 Model Misspecification
2.5 Quasi-likelihood
o 2.5.1 Maximum Quasi-likelihood Estimation
o 2.5.2 A More Complex Mean-Variance Model
2.6 Sandwich Estimation
2.7 Bootstrap Methods
o 2.7.1 The Bootstrap for a Univariate Parameter
o 2.7.2 The Bootstrap for Regression
o 2.7.3 Sandwich Estimation and the Bootstrap
2.8 Choice of Estimating Function
2.9 Hypothesis Testing
o 2.9.1 Motivation
o 2.9.2 Preliminaries
o 2.9.3 Score Tests
o 2.9.4 Wald Tests
o 2.9.5 Likelihood Ratio Tests
o 2.9.6 Quasi-likelihood
o 2.9.7 Comparison of Test Statistics
2.10 Concluding Remarks
2.11 Bibliographic Notes
2.12 Exercises
Chapter 3 Bayesian Inference
3.1 Introduction
3.2 The Posterior Distribution and Its Summarization
3.3 Asymptotic Properties of Bayesian Estimators
3.4 Prior Choice
o 3.4.1 Baseline Priors
o 3.4.2 Substantive Priors
o 3.4.3 Priors on Meaningful Scales
o 3.4.4 Frequentist Considerations
3.5 Model Misspecification
3.6 Bayesian Model Averaging
3.7 Implementation
o 3.7.1 Conjugacy
o 3.7.2 Laplace Approximation
o 3.7.3 Quadrature
o 3.7.4 Integrated Nested Laplace Approximations
o 3.7.5 Importance Sampling Monte Carlo
o 3.7.6 Direct Sampling Using Conjugacy
o 3.7.7 Direct Sampling Using the Rejection Algorithm
3.8 Markov Chain Monte Carlo
o 3.8.1 Markov Chains for Exploring Posterior Distributions
o 3.8.2 The Metropolis-Hastings Algorithm
o 3.8.3 The Metropolis Algorithm
o 3.8.4 The Gibbs Sampler
o 3.8.5 Combining Markov Kernels: Hybrid Schemes
o 3.8.6 Implementation Details
o 3.8.7 Implementation Summary
3.9 Exchangeability
3.10 Hypothesis Testing with Bayes Factors
3.11 Bayesian Inference Based on a Sampling Distribution
3.12 Concluding Remarks
3.13 Bibliographic Notes
3.14 Exercises
Chapter 4 Hypothesis Testing and Variable Selection
4.1 Introduction
4.2 Frequentist Hypothesis Testing
o 4.2.1 Fisherian Approach
o 4.2.2 Neyman-Pearson Approach
o 4.2.3 Critique of the Fisherian Approach
o 4.2.4 Critique of the Neyman-Pearson Approach
4.3 Bayesian Hypothesis Testing with Bayes Factors
o 4.3.1 Overview of Approaches
o 4.3.2 Critique of the Bayes Factor Approach
o 4.3.3 A Bayesian View of Frequentist Hypothesis Testing
4.4 The Jeffreys-Lindley Paradox
4.5 Testing Multiple Hypotheses: General Considerations
4.6 Testing Multiple Hypotheses: Fixed Number of Tests
o 4.6.1 Frequentist Analysis
o 4.6.2 Bayesian Analysis
4.7 Testing Multiple Hypotheses: Variable Selection
4.8 Approaches to Variable Selection and Modeling
o 4.8.1 Stepwise Methods
o 4.8.2 All Possible Subsets
o 4.8.3 Bayesian Model Averaging
o 4.8.4 Shrinkage Methods
4.9 Model Building Uncertainty
4.10 A Pragmatic Compromise to Variable Selection
4.11 Concluding Comments
4.12 Bibliographic Notes
4.13 Exercises
Part II
Chapter 5 Linear Models
5.1 Introduction
5.2 Motivating Example: Prostate Cancer
5.3 Model Specification
5.4 A Justification for Linear Modeling
5.5 Parameter Interpretation
o 5.5.1 Causation Versus Association
o 5.5.2 Multiple Parameters
o 5.5.3 Data Transformations
5.6 Frequentist Inference
o 5.6.1 Likelihood
o 5.6.2 Least Squares Estimation
o 5.6.3 The Gauss-Markov Theorem
o 5.6.4 Sandwich Estimation
5.7 Bayesian Inference
5.8 Analysis of Variance
o 5.8.1 One-Way ANOVA
o 5.8.2 Crossed Designs
o 5.8.3 Nested Designs
o 5.8.4 Random and Mixed Effects Models
5.9 Bias-Variance Trade-Off
5.10 Robustness to Assumptions
o 5.10.1 Distribution of Errors
o 5.10.2 Nonconstant Variance
o 5.10.3 Correlated Errors
5.11 Assessment of Assumptions
o 5.11.1 Review of Assumptions
o 5.11.2 Residuals and Influence
o 5.11.3 Using the Residuals
5.12 Example: Prostate Cancer
5.13 Concluding Remarks
5.14 Bibliographic Notes
5.15 Exercises
Chapter 6 General Regression Models
6.1 Introduction
6.2 Motivating Example: Pharmacokinetics of Theophylline
6.3 Generalized Linear Models
6.4 Parameter Interpretation
6.5 Likelihood Inference for GLMs
o 6.5.1 Estimation
o 6.5.2 Computation
o 6.5.3 Hypothesis Testing
6.6 Quasi-likelihood Inference for GLMs
6.7 Sandwich Estimation for GLMs
6.8 Bayesian Inference for GLMs
o 6.8.1 Prior Specification
o 6.8.2 Computation
o 6.8.3 Hypothesis Testing
o 6.8.4 Overdispersed GLMs
6.9 Assessment of Assumptions for GLMs
6.10 Nonlinear Regression Models
6.11 Identifiability
6.12 Likelihood Inference for Nonlinear Models
o 6.12.1 Estimation
o 6.12.2 Hypothesis Testing
6.13 Least Squares Inference
6.14 Sandwich Estimation for Nonlinear Models
6.15 The Geometry of Least Squares
6.16 Bayesian Inference for Nonlinear Models
o 6.16.1 Prior Specification
o 6.16.2 Computation
o 6.16.3 Hypothesis Testing
6.17 Assessment of Assumptions for Nonlinear Models
6.18 Concluding Remarks
6.19 Bibliographic Notes
6.20 Exercises
Chapter 7 Binary Data Models
7.1 Introduction
7.2 Motivating Examples
o 7.2.1 Outcome After Head Injury
o 7.2.2 Aircraft Fasteners
o 7.2.3 Bronchopulmonary Dysplasia
7.3 The Binomial Distribution
o 7.3.1 Genesis
o 7.3.2 Rare Events
7.4 Generalized Linear Models for Binary Data
o 7.4.1 Formulation
o 7.4.2 Link Functions
7.5 Overdispersion
7.6 Logistic Regression Models
o 7.6.1 Parameter Interpretation
o 7.6.2 Likelihood Inference for Logistic Regression Models
o 7.6.3 Quasi-likelihood Inference for Logistic Regression Models
o 7.6.4 Bayesian Inference for Logistic Regression Models
7.7 Conditional Likelihood Inference
7.8 Assessment of Assumptions
7.9 Bias, Variance, and Collapsibility
7.10 Case-Control Studies
o 7.10.1 The Epidemiological Context
o 7.10.2 Estimation for a Case-Control Study
o 7.10.3 Estimation for a Matched Case-Control Study
7.11 Concluding Remarks
7.12 Bibliographic Notes
7.13 Exercises
Part III
Chapter 8 Linear Models
8.1 Introduction
8.2 Motivating Example: Dental Growth Curves
8.3 The Efficiency of Longitudinal Designs
8.4 Linear Mixed Models
o 8.4.1 The General Framework
o 8.4.2 Covariance Models for Clustered Data
o 8.4.3 Parameter Interpretation for Linear Mixed Models
8.5 Likelihood Inference for Linear Mixed Models
o 8.5.1 Inference for Fixed Effects
o 8.5.2 Inference for Variance Components via Maximum Likelihood
o 8.5.3 Inference for Variance Components via Restricted Maximum Likelihood
o 8.5.4 Inference for Random Effects
8.6 Bayesian Inference for Linear Mixed Models
o 8.6.1 A Three-Stage Hierarchical Model
o 8.6.2 Hyperpriors
o 8.6.3 Implementation
o 8.6.4 Extensions
8.7 Generalized Estimating Equations
o 8.7.1 Motivation
o 8.7.2 The GEE Algorithm
o 8.7.3 Estimation of Variance Parameters
8.8 Assessment of Assumptions
o 8.8.1 Review of Assumptions
o 8.8.2 Approaches to Assessment
8.9 Cohort and Longitudinal Effects
8.10 Concluding Remarks
8.11 Bibliographic Notes
8.12 Exercises
Chapter 9 General Regression Models
9.1 Introduction
9.2 Motivating Examples
o 9.2.1 Contraception Data
o 9.2.2 Seizure Data
o 9.2.3 Pharmacokinetics of Theophylline
9.3 Generalized Linear Mixed Models
9.4 Likelihood Inference for Generalized Linear Mixed Models
9.5 Conditional Likelihood Inference for Generalized Linear Mixed Models
9.6 Bayesian Inference for Generalized Linear Mixed Models
o 9.6.1 Model Formulation
o 9.6.2 Hyperpriors
9.7 Generalized Linear Mixed Models with Spatial Dependence
o 9.7.1 A Markov Random Field Prior
o 9.7.2 Hyperpriors
9.8 Conjugate Random Effects Models
9.9 Generalized Estimating Equations for Generalized Linear Models
9.10 GEE2: Connected Estimating Equations
9.11 Interpretation of Marginal and Conditional Regression Coefficients
9.12 Introduction to Modeling Dependent Binary Data
9.13 Mixed Models for Binary Data
o 9.13.1 Generalized Linear Mixed Models for Binary Data
o 9.13.2 Likelihood Inference for the Binary Mixed Model
o 9.13.3 Bayesian Inference for the Binary Mixed Model
o 9.13.4 Conditional Likelihood Inference for Binary Mixed Models
9.14 Marginal Models for Dependent Binary Data
o 9.14.1 Generalized Estimating Equations
o 9.14.2 Loglinear Models
o 9.14.3 Further Multivariate Binary Models
9.15 Nonlinear Mixed Models
9.16 Parameterization of the Nonlinear Model
9.17 Likelihood Inference for the Nonlinear Mixed Model
9.18 Bayesian Inference for the Nonlinear Mixed Model
o 9.18.1 Hyperpriors
o 9.18.2 Inference for Functions of Interest
9.19 Generalized Estimating Equations
9.20 Assessment of Assumptions for General Regression Models
9.21 Concluding Remarks
9.22 Bibliographic Notes
9.23 Exercises
Part IV
Chapter 10 Preliminaries for Nonparametric Regression
10.1 Introduction
10.2 Motivating Examples
o 10.2.1 Light Detection and Ranging
o 10.2.2 Ethanol Data
10.3 The Optimal Prediction
o 10.3.1 Continuous Responses
o 10.3.2 Discrete Responses with K Categories
o 10.3.3 General Responses
o 10.3.4 In Practice
10.4 Measures of Predictive Accuracy
o 10.4.1 Continuous Responses
o 10.4.2 Discrete Responses with K Categories
o 10.4.3 General Responses
10.5 A First Look at Shrinkage Methods
o 10.5.1 Ridge Regression
o 10.5.2 The Lasso
10.6 Smoothing Parameter Selection
o 10.6.1 Mallows CP
o 10.6.2 K-Fold Cross-Validation
o 10.6.3 Generalized Cross-Validation
o 10.6.4 AIC for General Models
o 10.6.5 Cross-Validation for Generalized Linear Models
10.7 Concluding Comments
10.8 Bibliographic Notes
10.9 Exercises
Chapter 11 Spline and Kernel Methods
11.1 Introduction
11.2 Spline Methods
o 11.2.1 Piecewise Polynomials and Splines
o 11.2.2 Natural Cubic Splines
o 11.2.3 Cubic Smoothing Splines
o 11.2.4 B-Splines
o 11.2.5 Penalized Regression Splines
o 11.2.6 A Brief Spline Summary
o 11.2.7 Inference for Linear Smoothers
o 11.2.8 Linear Mixed Model Spline Representation: Likelihood Inference
o 11.2.9 Linear Mixed Model Spline Representation: Bayesian Inference
11.3 Kernel Methods
o 11.3.1 Kernels
o 11.3.2 Kernel Density Estimation
o 11.3.3 The Nadaraya-Watson Kernel Estimator
o 11.3.4 Local Polynomial Regression
11.4 Variance Estimation
11.5 Spline and Kernel Methods for Generalized Linear Models
o 11.5.1 Generalized Linear Models with Penalized Regression Splines
o 11.5.2 A Generalized Linear Mixed Model Spline Representation
o 11.5.3 Generalized Linear Models with Local Polynomials
11.6 Concluding Comments
11.7 Bibliographic Notes
11.8 Exercises
Chapter 12 Nonparametric Regression with Multiple Predictors
12.1 Introduction
12.2 Generalized Additive Models
o 12.2.1 Model Formulation
o 12.2.2 Computation via Backfitting
12.3 Spline Methods with Multiple Predictors
o 12.3.1 Natural Thin Plate Splines
o 12.3.2 Thin Plate Regression Splines
o 12.3.3 Tensor Product Splines
12.4 Kernel Methods with Multiple Predictors
12.5 Smoothing Parameter Estimation
o 12.5.1 Conventional Approaches
o 12.5.2 Mixed Model Formulation
12.6 Varying-Coefficient Models
12.7 Regression Trees
o 12.7.1 Hierarchical Partitioning
o 12.7.2 Multiple Adaptive Regression Splines
12.8 Classification
o 12.8.1 Logistic Models with K Classes
o 12.8.2 Linear and Quadratic Discriminant Analysis
o 12.8.3 Kernel Density Estimation and Classification
o 12.8.4 Classification Trees
o 12.8.5 Bagging
o 12.8.6 Random Forests
12.9 Concluding Comments
12.10 Bibliographic Notes
12.11 Exercises
Part V
Appendix A Differentiation of Matrix Expressions
Appendix B Matrix Results
Appendix C Some Linear Algebra
Appendix D Probability Distributions and Generating Functions
Appendix E Functions of Normal Random Variables
Appendix F Some Results from Classical Statistics
Appendix G Basic Large Sample Theory
References
Index
Author(s): Jon Wakefield
Series: Springer Series in Statistics
Edition: 2013
Publisher: Springer
Year: 2013
Language: English
Pages: 718
Tags: Mathematics; Probability Theory and Mathematical Statistics; Mathematical Statistics