The jackknife and the bootstrap are nonparametric methods for assessing the errors in a statistical estimation problem. They provide several advantages over the traditional parametric approach: the methods are easy to describe and they apply to arbitrarily complicated situations; distribution assumptions, such as normality, are never made.
This monograph connects the jackknife, the bootstrap, and many other related ideas such as cross-validation, random subsampling, and balanced repeated replications into a unified exposition. The theoretical development is at an easy mathematical level and is supplemented by a large number of numerical examples.
The methods described in this monograph form a useful set of tools for the applied statistician. They are particularly useful in problem areas where complicated data structures are common, for example, censored data, missing data, and highly multivariate situations.
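As a quick illustration of the kind of error assessment described above, here is a minimal sketch, not taken from the monograph itself, of the standard jackknife and bootstrap estimates of the standard error of a statistic (here the sample median), computed on hypothetical simulated data in Python.

    # Illustrative sketch only: jackknife and bootstrap standard-error
    # estimates for the sample median of a hypothetical data set.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=10.0, scale=2.0, size=25)   # hypothetical sample
    stat = np.median                               # statistic of interest
    n = len(x)

    # Jackknife: recompute the statistic leaving out one observation at a time,
    # then scale the spread of the leave-one-out values by (n - 1) / n.
    jack = np.array([stat(np.delete(x, i)) for i in range(n)])
    se_jack = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))

    # Bootstrap: recompute the statistic on B resamples drawn with replacement;
    # the standard deviation of the replicates estimates the standard error.
    B = 2000
    boot = np.array([stat(rng.choice(x, size=n, replace=True)) for _ in range(B)])
    se_boot = boot.std(ddof=1)

    print(f"jackknife SE of the median: {se_jack:.3f}")
    print(f"bootstrap SE of the median: {se_boot:.3f}")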
Author: Bradley Efron
Series: CBMS-NSF Regional Conference Series in Applied Mathematics 38
Publisher: Society for Industrial and Applied Mathematics
Year: 1987
Language: English
Pages: 103
City: Philadelphia, PA
Contents:
Preface
CHAPTER 1 Introduction
CHAPTER 2 The Jackknife Estimate of Bias
CHAPTER 3 The Jackknife Estimate of Variance
CHAPTER 4 Bias of the Jackknife Variance Estimate
CHAPTER 5 The Bootstrap
CHAPTER 6 The Infinitesimal Jackknife, the Delta Method and the Influence Function
CHAPTER 7 Cross-Validation, the Jackknife and the Bootstrap
CHAPTER 8 Balanced Repeated Replications (Half-Sampling)
CHAPTER 9 Random Subsampling
CHAPTER 10 Nonparametric Confidence Intervals
References