Mathematical Statistics: A Unified Introduction


This textbook introduces the mathematical concepts and methods that underlie statistics. The course is unified in the sense that no prior knowledge of probability theory is assumed; probability is developed as it is needed. The book is committed both to a high level of mathematical seriousness and to an intimate connection with applications. In its teaching style, the book is mathematically complete, concrete, constructive, and active. The text is aimed at the upper-undergraduate or beginning master's level. It assumes the usual two-year college mathematics sequence, including an introduction to multiple integrals, matrix algebra, and infinite series.

Author(s): George R. Terrell
Series: Springer Texts in Statistics
Edition: 1st
Publisher: Springer
Year: 1999

Language: English
Commentary: missing pp. i-xii
Pages: 454
City: New York

Contents......Page 1
1.1 Introduction......Page 19
1.2.1 Plotting Data......Page 20
1.2.2 Location Models......Page 21
1.3.1 Data from Several Treatments......Page 22
1.3.2 Centered Models......Page 24
1.3.3 Degrees of Freedom......Page 25
1.4.1 Cross-Classified Observations......Page 26
1.4.2 Additive Models......Page 28
1.4.3 Balanced Designs......Page 29
1.4.4 Interaction......Page 31
1.4.5 Centering Full Models......Page 32
1.5.1 Interpolating Between Levels......Page 33
1.5.2 Simple Linear Regression......Page 35
1.6.1 Double Interpolation......Page 37
1.6.2 Multiple Linear Regression......Page 39
1.7.1 Counted Data......Page 40
1.7.2 Independence Models......Page 42
1.7.3 Loglinear Models......Page 43
1.7.4 Loglinear Independence Models......Page 44
1.7.5 Loglinear Saturated Models*......Page 46
1.8.1 Interpolating in Contingency Tables......Page 47
1.8.2 Linear Logistic Regression......Page 49
1.9 Summary......Page 50
1.10 Exercises......Page 51
1.11 Supplementary Exercises......Page 55
2.1 Introduction......Page 60
2.2.1 Multiple Observations as Vectors......Page 61
2.2.2 Distances as Errors......Page 63
2.3.1 Simple Proportion Models......Page 64
2.3.2 Estimating the Constant......Page 66
2.3.3 Solving the Problem Using Matrix Notation......Page 68
2.3.4 Geometric Degrees of Freedom......Page 69
2.3.5 Schwarz's Inequality......Page 70
2.4.1 Least-Squares Location Estimation......Page 71
2.4.2 Sample Variance......Page 72
2.5.1 Analysis of Variance......Page 73
2.5.2 Geometric Interpretation......Page 75
2.5.3 ANOVA Tables......Page 77
2.5.4 The F-Statistic......Page 78
2.5.5 The Kruskal–Wallis Statistic......Page 80
2.6.1 Estimates for Simple Linear Regression......Page 81
2.6.2 ANOVA for Regression......Page 83
2.7.1 Standardizing the Regression Line......Page 84
2.7.2 Properties of the Sample Correlation......Page 85
2.8.1 ANOVA for Two-Way Layouts......Page 87
2.8.2 Additive Models......Page 89
2.10 Exercises......Page 91
2.11 Supplementary Exercises......Page 94
3.1 Introduction......Page 97
3.2.1 What Is Probability?......Page 98
3.2.2 Probabilities by Counting......Page 99
3.3.1 Basic Rules for Counting......Page 101
3.3.2 Counting Lists......Page 102
3.3.3 Combinations......Page 104
3.3.4 Multinomial Counting......Page 105
3.4.1 Complicated Counts......Page 106
3.4.2 The Birthday Problem......Page 107
3.4.3 General Principles About Probability......Page 108
3.5.1 An Upper Bound......Page 110
3.5.2 A Lower Bound......Page 112
3.5.3 A Useful Approximation......Page 113
3.6 Sampling......Page 114
3.8 Exercises......Page 115
3.9 Supplementary Exercises......Page 118
4.1 Introduction......Page 122
4.2.1 Uniform Geometric Probability......Page 123
4.2.2 General Properties......Page 125
4.3.2 Rules for Combining Events......Page 126
4.4.1 In General......Page 127
4.4.2 Axioms of Probability......Page 128
4.4.3 Consequences of the Axioms......Page 129
4.5.1 Definition......Page 130
4.5.2 Examples......Page 131
4.6.1 Partitions......Page 132
4.6.2 Division into Cases......Page 133
4.6.3 Bayes's Theorem......Page 135
4.6.4 Bayes's Theorem Applied to Partitions......Page 136
4.7.1 Irrelevant Conditions......Page 137
4.7.3 Near-Independence......Page 138
4.8.1 Probability Density......Page 139
4.8.2 Sigma Algebras and Borel Algebras*......Page 142
4.8.3 Kolmogorov's Axiom*......Page 144
4.10 Exercises......Page 147
4.11 Supplementary Exercises......Page 149
5.1 Introduction......Page 152
5.2.1 Some Simple Examples......Page 153
5.2.2 Discrete Random Variables......Page 154
5.2.3 The Negative Hypergeometric Family......Page 155
5.3.1 The Hypergeometric Family......Page 157
5.3.3 Fisher's Test for Independence......Page 159
5.3.5 The Sign Test......Page 161
5.4.1 Some Properties......Page 162
5.4.2 Continuous Variables......Page 163
5.4.3 Symmetry and Duality......Page 165
5.5.1 Average Values......Page 167
5.5.2 Discrete Random Variables......Page 168
5.5.3 The Method of Indicators......Page 169
5.6.2 Compatibility with the Data......Page 171
5.7 Summary......Page 173
5.8 Exercises......Page 174
5.9 Supplementary Exercises......Page 178
6.1 Introduction......Page 181
6.2.1 The Geometric Approximation......Page 182
6.2.3 Negative Binomial Approximations......Page 183
6.2.4 Negative Binomial Variables......Page 184
6.2.5 Convergence in Distribution......Page 185
6.3.1 Binomial Approximations......Page 186
6.3.2 Binomial Random Variables......Page 187
6.3.3 Bernoulli Processes......Page 189
6.4.1 Poisson Approximation to Binomial Probabilities......Page 190
6.4.2 Approximation to the Negative Binomial......Page 191
6.4.3 Poisson Random Variables......Page 192
6.5 More About Expectation......Page 193
6.6.1 Expectations of Functions......Page 196
6.6.2 Variance......Page 198
6.6.3 Variances of Some Families......Page 199
6.7.1 Estimating Binomial p......Page 201
6.7.2 Confidence Bounds for Binomial p......Page 202
6.7.3 Confidence Intervals......Page 203
6.7.4 Two-Sided Hypothesis Tests......Page 204
6.8 The Poisson Limit of the Negative Hypergeometric Family*......Page 205
6.9 Summary......Page 207
6.10 Exercises......Page 208
6.11 Supplementary Exercises......Page 212
7.1 Introduction......Page 215
7.2.1 Multinomial Random Vectors......Page 216
7.2.2 Marginal and Conditional Distributions......Page 217
7.3.1 Random Coordinates......Page 220
7.3.2 Multivariate Cumulative Distribution Functions......Page 222
7.4.1 Independence and Random Samples......Page 224
7.4.2 Sums of Random Vectors......Page 225
7.4.3 Convolutions......Page 226
7.5.2 Conditional Expectations......Page 227
7.5.3 Regression......Page 228
7.5.4 Linear Regression......Page 229
7.5.5 Covariance......Page 231
7.5.6 The Correlation Coefficient......Page 232
7.6.1 Expectations and Variances......Page 233
7.6.2 The Covariance Matrix......Page 234
7.6.4 Statistical Properties of Sample Means and Variances......Page 235
7.6.5 The Method of Indicators......Page 237
7.7.2 Markov's Inequality......Page 239
7.7.3 Convergence in Mean Squared Error......Page 240
7.8.1 Parameters in Models as Random Variables......Page 241
7.8.2 An Example of Bayesian Inference......Page 242
7.9 Summary......Page 243
7.10 Exercises......Page 244
7.11 Supplementary Exercises......Page 248
8.1 Introduction......Page 250
8.2.1 Posterior Probability of a Parameter Value......Page 251
8.2.2 Maximum Likelihood......Page 252
8.3.1 Ratio of the Maximum Likelihood to a Hypothetical Likelihood......Page 254
8.3.2 G-Squared......Page 255
8.4.1 Chi-Squared......Page 256
8.4.2 Comparing the Two Statistics......Page 257
8.4.4 Multinomial Models......Page 258
8.5.1 Conditions for a Maximum......Page 259
8.5.2 Proportional Fitting......Page 261
8.5.3 Iterative Proportional Fitting*......Page 262
8.5.4 Why Does It Work?*......Page 265
8.6.1 Relative G-Squared......Page 266
8.6.2 An ANOVA-like Table......Page 267
8.7.2 General Logistic Regression......Page 269
8.8.1 Linear Approximation to a Root......Page 271
8.8.2 Dose–Response with Historical Controls......Page 272
8.9 Summary......Page 273
8.10 Exercises......Page 274
8.11 Supplementary Exercises......Page 276
9.1 Introduction......Page 279
9.2.2 Continuous Variables......Page 280
9.3.1 How Would It Look?......Page 281
9.3.2 How to Construct a Poisson Process......Page 282
9.3.3 Spacings Between Events......Page 284
9.3.4 Gamma Variables......Page 285
9.3.5 Poisson Process as the Limit of a Hypergeometric Process*......Page 286
9.4.1 Transforming Variables......Page 288
9.4.2 Gamma Densities......Page 289
9.4.3 General Properties......Page 290
9.4.4 Interpretation......Page 292
9.5.1 Order Statistics......Page 295
9.5.2 Dirichlet Processes......Page 296
9.5.3 Beta Variables......Page 297
9.5.4 Beta Densities......Page 299
9.5.5 Connections......Page 300
9.6.1 Hypothesis Tests and Parameter Estimates......Page 302
9.6.2 Confidence Intervals......Page 303
9.6.3 Inferences About the Shape Parameter......Page 304
9.7.1 Alternative Hypotheses......Page 305
9.7.2 Most Powerful Tests......Page 306
9.8 Summary......Page 308
9.9 Exercises......Page 309
9.10 Supplementary Exercises......Page 311
10.1 Introduction......Page 313
10.2.2 Quantile Functions in General......Page 314
10.2.3 Continuous Quantile Functions......Page 316
10.3.1 Expectation as the Integral of a Quantile Function......Page 317
10.3.2 Markov's Inequality Revisited......Page 320
10.4.1 Changing Variables in a Density......Page 321
10.4.2 Expectation in Terms of a Density......Page 322
10.5.1 Shape of a Gamma Density......Page 324
10.5.2 Quadratic Approximation to the Log-Density......Page 325
10.5.3 Standard Normal Density......Page 328
10.5.5 Approximate Gamma Probabilities......Page 330
10.5.6 Computing Normal Probabilities......Page 331
10.5.7 Normal Tail Probabilities......Page 332
10.6.1 Dual Probabilities......Page 333
10.6.2 Continuity Correction......Page 335
10.7.1 The Normal Family......Page 336
10.7.2 Approximate Poisson Intervals......Page 337
10.7.3 Approximate Gamma Intervals......Page 338
10.9 Exercises......Page 339
10.10 Supplementary Exercises......Page 342
11.1 Introduction......Page 344
11.2.2 The General Case......Page 345
11.3.1 Two Order Statistics at Once......Page 346
11.3.2 Joint Density of Two Order Statistics......Page 347
11.3.3 Joint Densities in General......Page 348
11.3.4 The Family of Divisions of an Interval......Page 349
11.4.1 Affine Multivariate Transformations......Page 350
11.4.2 Dirichlet Densities......Page 352
11.4.3 Some Properties of Dirichlet Variables......Page 353
11.4.4 General Change of Variables......Page 355
11.5.1 Gammas Conditioned on Their Sum......Page 356
11.5.3 Gamma Densities in General......Page 357
11.5.4 Chi-Squared Variables......Page 359
11.6.1 Bayes's Theorem Revisited......Page 360
11.6.2 Application to Gamma Observations......Page 361
11.7.2 Linear Combinations of Normal Variables......Page 363
11.7.4 Approximating a Beta Variable......Page 365
11.8.1 Binomial Variables with Large Variance......Page 366
11.8.2 Negative Binomial Variables with Small Coefficient of Variation......Page 367
11.9.1 Approximating Two Order Statistics......Page 368
11.9.2 Correlated Normal Variables......Page 369
11.10.1 Family Relationships......Page 370
11.10.2 Asymptotic Normality......Page 371
11.11 Summary......Page 372
11.12 Exercises......Page 373
11.13 Supplementary Exercises......Page 375
12.1 Introduction......Page 377
12.2.1 A Probability Model for Errors......Page 378
12.2.2 Statistics of Fit for the Error Model......Page 379
12.3.1 Independence Models for Errors......Page 380
12.3.2 Distribution of R-squared......Page 381
12.3.3 Elementary Errors......Page 382
12.4.1 Continuous Likelihoods......Page 383
12.4.2 Maximum Likelihood with Normal Errors......Page 384
12.4.3 Unbiased Variance Estimates......Page 385
12.5.1 When the Variance Is Known......Page 386
12.5.2 When the Variance Is Unknown......Page 387
12.6.1 Matrix Form......Page 388
12.6.2 Centered Form......Page 389
12.6.3 Least-Squares Estimates......Page 390
12.6.4 Homoscedastic Errors......Page 391
12.6.5 Linear Combinations of Parameters......Page 393
12.7.2 Gauss–Markov Theorem......Page 394
12.8.1 The Score Estimator......Page 395
12.8.2 How Good Is It?......Page 397
12.8.3 The Information Inequality......Page 398
12.9 Summary......Page 400
12.10 Exercises......Page 401
12.11 Supplementary Exercises......Page 402
13.1 Introduction......Page 405
13.2.2 The P.G.F. Representation......Page 406
13.2.3 The P.G.F. As an Expectation......Page 408
13.2.4 Applications to Compound Variables......Page 409
13.2.5 Factorial Moments......Page 411
13.3.1 Comparison with Exponential Variables......Page 412
13.3.2 The M.G.F. as an Expectation......Page 414
13.4.1 Poisson Limits......Page 415
13.4.2 Law of Large Numbers......Page 416
13.4.3 Normal Limits......Page 417
13.4.4 A Central Limit Theorem......Page 418
13.5.1 Natural Exponential Forms......Page 420
13.5.2 Expectations......Page 421
13.5.3 Natural Parameters......Page 422
13.5.5 Other Sufficient Statistics......Page 423
13.6.1 Conditional Improvement......Page 424
13.6.2 Sufficient Statistics......Page 426
13.7.1 Tail Probability Approximation......Page 427
13.7.2 Tilting a Random Variable......Page 428
13.7.3 Normal Tail Approximation......Page 429
13.7.4 Poisson Tail Approximations......Page 431
13.7.5 Small-Sample Asymptotics......Page 432
13.9 Exercises......Page 433
13.10 Supplementary Exercises......Page 436
Index......Page 446