This text concentrates on what can be achieved using likelihood/Fisherian methods of accounting for uncertainty in statistical problems. It takes the likelihood as the concept that best unifies the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated with realistic examples, ranging from a simple comparison of two accident rates to complex studies requiring generalized linear or semiparametric modelling. The emphasis is on likelihood not merely as a device for producing estimates, but as an important tool for modelling.
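As a flavour of what this likelihood-ratio reasoning looks like in practice, here is a minimal sketch (not taken from the book; the counts, exposures, and variable names below are invented for illustration) of comparing two Poisson accident rates in Python:

# A minimal sketch, not from the book: comparing two Poisson accident rates
# by likelihood ratio, in the spirit of the "two accident rates" example
# mentioned above. Counts, exposures, and names here are hypothetical.
from math import log, lgamma

def poisson_loglik(x: int, mean: float) -> float:
    """Log-likelihood contribution of a single Poisson count."""
    return x * log(mean) - mean - lgamma(x + 1)

x1, x2 = 12, 23        # hypothetical accident counts in two periods
t1, t2 = 1.0, 1.0      # exposures (e.g. years); equal here for simplicity

# MLEs under the two models: separate rates versus one common rate
lam1, lam2 = x1 / t1, x2 / t2
lam0 = (x1 + x2) / (t1 + t2)

loglik_separate = poisson_loglik(x1, lam1 * t1) + poisson_loglik(x2, lam2 * t2)
loglik_common = poisson_loglik(x1, lam0 * t1) + poisson_loglik(x2, lam0 * t2)

# Likelihood-ratio statistic W = 2 * (difference in maximized log-likelihoods);
# large W is evidence that the two rates differ.
W = 2 * (loglik_separate - loglik_common)
print(f"lambda1={lam1:.1f}, lambda2={lam2:.1f}, W={W:.2f}")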
Author(s): Yudi Pawitan
Edition: 1
Publisher: Oxford University Press, USA
Year: 2001
Language: English
Pages: 546
Tags: Mathematics; Probability theory and mathematical statistics; Mathematical statistics;
Book Cover......Page 1
Preface......Page 5
CONTENTS......Page 9
1.1 Prototype of statistical problems......Page 15
1.2 Statistical problems and their models......Page 17
1.3 Statistical uncertainty: inevitable controversies......Page 20
1.4 The emergence of statistics......Page 22
1.5 Fisher and the third way......Page 28
2.1 Classical definition......Page 35
2.2 Examples......Page 38
2.3 Combining likelihoods......Page 41
2.4 Likelihood ratio......Page 43
2.5 Maximum and curvature of likelihood......Page 44
2.6 Likelihood-based intervals......Page 49
2.7 Standard error and Wald statistic......Page 55
2.8 Invariance principle......Page 57
2.9 Practical implications of invariance principle......Page 59
3.1 Sufficiency......Page 67
3.2 Minimal sufficiency......Page 69
3.3 Multiparameter models......Page 72
3.4 Profile likelihood......Page 75
3.5 Calibration in multiparameter case......Page 80
4.1 Binomial or Bernoulli models......Page 89
4.2 Binomial model with under- or overdispersion......Page 92
4.3 Comparing two proportions......Page 94
4.4 Poisson model......Page 98
4.5 Poisson with overdispersion......Page 100
4.6 Traffic deaths example......Page 102
4.7 Aspirin data example......Page 103
4.8 Continuous data......Page 105
4.9 Exponential family......Page 111
4.10 Box-Cox transformation family......Page 118
4.11 Location-scale family......Page 120
5.1 Bias of point estimates......Page 133
5.2 Estimating and reducing bias......Page 135
5.3 Variability of point estimates......Page 139
5.4 Likelihood and P-value......Page 141
5.5 CI and coverage probability......Page 144
5.6 Confidence density, CI and the bootstrap......Page 147
5.7 Exact inference for Poisson model......Page 150
5.9 Nuisance parameters......Page 156
5.10 Criticism of CIs......Page 158
6. Modelling relationships: regression models......Page 165
6.1 Normal linear models......Page 166
6.2 Logistic regression models......Page 170
6.3 Poisson regression models......Page 173
6.4 Nonnormal continuous regression......Page 176
6.5 Exponential family regression models......Page 179
6.6 Deviance in GLM......Page 182
6.7 Iterative weighted least squares......Page 190
6.8 Box-Cox transformation family......Page 194
6.9 Location-scale regression models......Page 197
7.1 Ideal inference machine?......Page 209
7.2 Sufficiency and the likelihood principles......Page 210
7.3 Conditionality principle and ancillarity......Page 212
7.4 Birnbaum's theorem......Page 213
7.5 Sequential experiments and stopping rule......Page 215
7.6 Multiplicity......Page 220
7.7 Questioning the likelihood principle......Page 222
8.2 The mean of S(θ)......Page 231
8.3 The variance of S(θ)......Page 232
8.4 Properties of expected Fisher information......Page 235
8.5 Cramér-Rao lower bound......Page 237
8.6 Minimum variance unbiased estimation*......Page 239
8.7 Multiparameter CRLB......Page 242
9.1 Background results......Page 0
10. Dealing with nuisance parameters......Page 289
10.1 Inconsistent likelihood estimates......Page 290
10.2 Ideal case: orthogonal parameters......Page 292
10.3 Marginal and conditional likelihood......Page 294
10.4 Comparing Poisson means......Page 297
10.5 Comparing proportions......Page 299
10.6 Modified profile likelihood*......Page 302
10.7 Estimated likelihood......Page 308
11.1 ARMA models......Page 313
11.2 Markov chains......Page 315
11.3 Replicated Markov chains......Page 318
11.4 Spatial data......Page 321
11.5 Censored/survival data......Page 325
11.6 Survival regression models......Page 330
11.7 Hazard regression and Cox partial likelihood......Page 332
11.8 Poisson point processes......Page 336
11.9 Replicated Poisson processes......Page 340
11.10 Discrete time model for Poisson processes......Page 347
12.1 Motivation......Page 357
12.2 General specification......Page 358
12.3 Exponential family model......Page 360
12.4 General properties......Page 364
12.5 Mixture models......Page 365
12.6 Robust estimation......Page 368
12.7 Estimating infection pattern......Page 370
12.8 Mixed model estimation*......Page 372
12.9 Standard errors......Page 375
13.1 Analysis of Darwin's data......Page 381
13.2 Distance between model and the 'truth'......Page 383
13.3 Maximum likelihood under a wrong model......Page 386
13.4 Large-sample properties......Page 388
13.5 Comparing working models with the AIC......Page 391
13.6 Deriving the AIC......Page 395
14. Estimating equation and quasi-likelihood......Page 401
14.1 Examples......Page 403
14.2 Computing b in nonlinear cases......Page 406
14.3 Asymptotic distribution......Page 409
14.4 Generalized estimating equation......Page 411
14.5 Robust estimation......Page 414
14.6 Asymptotic properties......Page 420
15.1 Profile likelihood......Page 425
15.2 Double-bootstrap likelihood......Page 429
15.3 BCa bootstrap likelihood......Page 431
15.4 Exponential family model......Page 434
15.5 General cases: M-estimation......Page 436
15.6 Parametric versus empirical likelihood......Page 438
16.1 The need to extend the likelihood......Page 441
16.2 Statistical prediction......Page 443
16.3 Defining extended likelihood......Page 445
17. Random and mixed effects models......Page 451
17.1 Simple random effects models......Page 452
17.2 Normal linear mixed models......Page 455
17.3 Estimating genetic value from family data*......Page 458
17.4 Joint estimation of β and b......Page 460
17.5 Computing the variance component via β and b......Page 461
17.6 Examples......Page 464
17.7 Extension to several random effects......Page 468
17.8 Generalized linear mixed models......Page 474
17.9 Exact likelihood in GLMM......Page 476
17.10 Approximate likelihood in GLMM......Page 478
18.1 Motivation......Page 491
18.2 Linear mixed models approach......Page 495
18.3 Imposing smoothness using random effects model......Page 497
18.4 Penalized likelihood approach......Page 499
18.5 Estimate of f given σ² and σ_b²......Page 500
18.6 Estimating the smoothing parameter......Page 503
18.8 Partial linear models......Page 507
18.9 Smoothing nonequispaced data*......Page 508
18.10 Non-Gaussian smoothing......Page 510
18.11 Nonparametric density estimation......Page 515
18.12 Nonnormal smoothness condition*......Page 518
Bibliography......Page 521
INDEX......Page 533