Virtually any random process that develops chronologically can be viewed as a time series. In economics, the closing prices of stocks, the cost of money, the jobless rate, and retail sales are just a few of many examples. Developed from course notes and extensively classroom-tested, Applied Time Series Analysis with R, Second Edition includes examples across a variety of fields, develops theory, and provides an R-based software package to aid in addressing time series problems in a broad spectrum of fields. The material is organized so that graduate students in statistics, as well as in the natural and social sciences, can learn to use and understand the tools of applied time series analysis.
Features
Gives readers the ability to actually solve significant real-world problems
Addresses many types of nonstationary time series and cutting-edge methodologies
Promotes understanding of the data and associated models rather than treating them as the output of a "black box"
Provides the R package tswge, available on CRAN, which contains functions and over 100 real and simulated data sets to accompany the book. Extensive help on the use of tswge functions is provided in the appendices and on an associated website.
Over 150 exercises and extensive support for instructors
The second edition includes additional real-data examples and uses R-based code that helps students easily analyze data, generate realizations from models, and explore the associated characteristics. It also adds discussion of new advances in the analysis of long-memory data and data with time-varying frequencies (TVF).
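As a minimal sketch of the workflow described above, the following R code generates a realization from an AR(2) model and explores its characteristics using the book's tswge package. The AR coefficients are illustrative, and the example assumes the CRAN functions gen.arma.wge and plotts.sample.wge:

```r
# Sketch only: assumes the tswge package has been installed from CRAN.
library(tswge)

# Generate a realization of length 200 from the AR(2) model
# (1 - 1.2B + 0.8B^2) X_t = a_t  (coefficients chosen for illustration)
x <- gen.arma.wge(n = 200, phi = c(1.2, -0.8), sn = 1)

# Plot the realization along with its sample autocorrelations
# and a spectral density estimate
plotts.sample.wge(x)
```

Setting sn to a nonzero value fixes the random seed so the realization is reproducible.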
Author(s): Wayne A. Woodward, Henry L. Gray, Alan C. Elliott
Series: Statistics
Edition: 2nd
Publisher: CRC Press
Year: 2016
Language: English
Pages: 752
Tags: Time Series, R
Half Title......Page 2
Title Page......Page 3
Copyright Page......Page 4
Table of Contents......Page 6
Preface for Second Edition......Page 15
Acknowledgments......Page 17
1 Stationary Time Series......Page 18
1.1 Time Series......Page 19
1.2 Stationary Time Series......Page 22
1.3 Autocovariance and Autocorrelation Functions for Stationary Time Series......Page 24
1.4 Estimation of the Mean, Autocovariance, and Autocorrelation for Stationary Time Series......Page 29
1.4.1.1 Ergodicity of X̄......Page 31
1.4.1.2 Variance of X̄......Page 37
1.4.2 Estimation of γk......Page 38
1.4.3 Estimation of ρk......Page 40
1.5 Power Spectrum......Page 42
1.6 Estimating the Power Spectrum and Spectral Density for Discrete Time Series......Page 54
1.7 Time Series Examples......Page 59
1.7.1 Simulated Data......Page 60
1.7.2 Real Data......Page 65
Appendix 1A: Fourier Series......Page 71
Appendix 1B: R Commands......Page 72
Exercises......Page 77
2.1 Introduction to Linear Filters......Page 86
2.2 Stationary General Linear Processes......Page 88
2.3 Wold Decomposition Theorem......Page 90
2.4 Filtering Applications......Page 91
2.4.1 Butterworth Filters......Page 96
Appendix 2A: Theorem Proofs......Page 103
Appendix 2B: R Commands......Page 106
Exercises......Page 107
3.1 MA Processes......Page 110
3.1.1 MA(1) Model......Page 113
3.2 AR Processes......Page 116
3.2.1 Inverting the Operator......Page 122
3.2.2 AR(1) Model......Page 123
3.2.3 AR(p) Model for p ≥ 1......Page 131
3.2.4 Autocorrelations of an AR(p) Model......Page 132
3.2.5 Linear Difference Equations......Page 134
3.2.6 Spectral Density of an AR(p) Model......Page 136
3.2.7.1 Autocorrelations of an AR(2) Model......Page 137
3.2.7.3 Stationary/Causal Region of an AR(2)......Page 141
3.2.7.4 ψ-Weights of an AR(2) Model......Page 142
3.2.8 Summary of AR(1) and AR(2) Behavior......Page 152
3.2.9 AR(p) Model......Page 154
3.2.10 AR(1) and AR(2) Building Blocks of an AR(p) Model......Page 158
3.2.11 Factor Tables......Page 160
3.2.12 Invertibility/Infinite-Order AR Processes......Page 169
3.2.13 Two Reasons for Imposing Invertibility......Page 170
3.3 ARMA Processes......Page 173
3.3.2 Spectral Density of an ARMA(p,q) Model......Page 176
3.3.3 Factor Tables and ARMA(p,q) Models......Page 177
3.3.4 Autocorrelations of an ARMA(p,q) Model......Page 181
3.3.5 ψ-Weights of an ARMA(p,q)......Page 186
3.3.6 Approximating ARMA(p,q) Processes Using High-Order AR(p) Models......Page 188
3.4 Visualizing AR Components......Page 187
3.5 Seasonal ARMA(p,q) × (Ps, Qs)s Models......Page 192
3.6 Generating Realizations from ARMA(p,q) Processes......Page 198
3.6.1 MA(q) Model......Page 199
3.7 Transformations......Page 200
3.7.1 Memoryless Transformations......Page 202
3.7.2 AR Transformations......Page 203
Appendix 3A: Proofs of Theorems......Page 207
Appendix 3B: R Commands......Page 212
Exercises......Page 219
4.1 Stationary Harmonic Models......Page 229
4.1.1 Pure Harmonic Models......Page 231
4.1.2 Harmonic Signal-Plus-Noise Models......Page 233
4.1.3 ARMA Approximation to the Harmonic Signal-Plus-Noise Model......Page 235
4.2 ARCH and GARCH Processes......Page 239
4.2.1.1 The ARCH(1) Model......Page 242
4.2.1.2 The ARCH(q0) Model......Page 246
4.2.2 The GARCH(p0, q0) Process......Page 247
4.2.3 AR Processes with ARCH or GARCH Noise......Page 249
Appendix 4A: R Commands......Page 251
Exercises......Page 254
5.1 Deterministic Signal-Plus-Noise Models......Page 256
5.1.1 Trend-Component Models......Page 257
5.1.2 Harmonic Component Models......Page 259
5.2 ARIMA(p,d,q) and ARUMA(p,d,q) Processes......Page 260
5.2.1 Extended Autocorrelations of an ARUMA(p,d,q) Process......Page 263
5.3 Multiplicative Seasonal ARUMA(p,d,q) × (Ps, Ds, Qs)s Process......Page 270
5.3.1 Factor Tables for Seasonal Models of the Form of Equation 5.17 with s = 4 and s = 12......Page 272
5.4 Random Walk Models......Page 273
5.4.1 Random Walk......Page 274
5.5 G-Stationary Models for Data with Time-Varying Frequencies......Page 275
Appendix 5A: R Commands......Page 276
Exercises......Page 280
6.1 Mean-Square Prediction Background......Page 284
6.2 Box–Jenkins Forecasting for ARMA(p,q) Models......Page 286
6.2.1 General Linear Process Form of the Best Forecast Equation......Page 289
6.3 Properties of the Best Forecast X̂t0(ℓ)......Page 288
6.4 π-Weight Form of the Forecast Function......Page 290
6.5 Forecasting Based on the Difference Equation......Page 291
6.5.1 Difference Equation Form of the Best Forecast Equation......Page 293
6.5.2 Basic Difference Equation Form for Calculating Forecasts from an ARMA(p,q) Model......Page 294
6.6 Eventual Forecast Function......Page 298
6.7 Assessing Forecast Performance......Page 299
6.7.1 Probability Limits for Forecasts......Page 301
6.7.2 Forecasting the Last k Values......Page 305
6.8 Forecasts Using ARUMA(p,d,q) Models......Page 306
6.9 Forecasts Using Multiplicative Seasonal ARUMA Models......Page 316
6.10 Forecasts Based on Signal-Plus-Noise Models......Page 320
Appendix 6A: Proof of Projection Theorem......Page 324
Appendix 6B: Basic Forecasting Routines......Page 326
Exercises......Page 331
7.2 Preliminary Estimates......Page 336
7.2.1.1 Yule–Walker Estimates......Page 337
7.2.1.2 Least Squares Estimation......Page 340
7.2.1.3 Burg Estimates......Page 342
7.2.2.1 MM Estimation for an MA(q)......Page 345
7.2.2.2 MA(q) Estimation Using the Innovations Algorithm......Page 347
7.2.3.1 Extended Yule–Walker Estimates of the AR Parameters......Page 349
7.2.3.2 Tsay–Tiao Estimates of the AR Parameters......Page 350
7.2.3.3 Estimating the MA Parameters......Page 352
7.3 ML Estimation of ARMA(p,q) Parameters......Page 351
7.3.1 Conditional and Unconditional ML Estimation......Page 353
7.4 Backcasting and Estimating σa²......Page 358
7.5 Asymptotic Properties of Estimators......Page 362
7.5.1 AR Case......Page 363
7.5.1.1 Confidence Intervals: AR Case......Page 365
7.5.2 ARMA(p,q) Case......Page 366
7.5.2.1 Confidence Intervals for ARMA(p,q) Parameters......Page 369
7.5.3 Asymptotic Comparisons of Estimators for an MA(1)......Page 371
7.6 Estimation Examples Using Data......Page 373
7.7 ARMA Spectral Estimation......Page 380
7.8 ARUMA Spectral Estimation......Page 385
Appendix......Page 387
Exercises......Page 390
8.1 Preliminary Check for White Noise......Page 393
8.2 Model Identification for Stationary ARMA Models......Page 396
8.2.1 Model Identification Based on AIC and Related Measures......Page 397
8.3 Model Identification for Nonstationary ARUMA(p,d,q) Models......Page 401
8.3.1 Including a Nonstationary Factor in the Model......Page 403
8.3.2 Identifying Nonstationary Component(s) in a Model......Page 404
8.3.3 Decision Between a Stationary or a Nonstationary Model......Page 409
8.3.4 Deriving a Final ARUMA Model......Page 410
8.3.5.1 Including a Factor (1 − B)d in the Model......Page 413
8.3.5.2 Testing for a Unit Root......Page 417
8.3.5.3 Including a Seasonal Factor (1 − Bs) in the Model......Page 420
Appendix 8A: Model Identification Based on Pattern Recognition......Page 431
Appendix 8B: Model Identification Functions in tswge......Page 450
Exercises......Page 455
9.1 Residual Analysis......Page 459
9.1.2 Ljung–Box Test......Page 460
9.1.3 Other Tests for Randomness......Page 461
9.2 Stationarity versus Nonstationarity......Page 465
9.3 Signal-Plus-Noise versus Purely Autocorrelation-Driven Models......Page 472
9.3.1 Cochrane–Orcutt and Other Methods......Page 474
9.3.2 A Bootstrapping Approach......Page 475
9.4 Checking Realization Characteristics......Page 476
9.5 Comprehensive Analysis of Time Series Data: A Summary......Page 482
Appendix 9A: R Commands......Page 483
Exercises......Page 485
10.1 Multivariate Time Series Basics......Page 488
10.2 Stationary Multivariate Time Series......Page 490
10.2.1.1 Estimating μ......Page 495
10.3 Multivariate (Vector) ARMA Processes......Page 496
10.3.1 Forecasting Using VAR(p) Models......Page 505
10.3.3.1 Yule–Walker Estimation......Page 508
10.3.3.2 Least Squares and Conditional ML Estimation......Page 509
10.3.3.3 Burg-Type Estimation......Page 510
10.3.6.1 Model Selection......Page 511
10.3.6.3 Testing the Residuals for White Noise......Page 512
10.4 Nonstationary VARMA Processes......Page 514
10.5 Testing for Association between Time Series......Page 515
10.5.1 Testing for Independence of Two Stationary Time Series......Page 518
10.5.2 Testing for Cointegration between Nonstationary Time Series......Page 522
10.6.2 Observation Equation......Page 525
10.6.3 Goals of State-Space Modeling......Page 528
10.6.4.3 Smoothing Using the Kalman Filter......Page 529
10.6.4.4 h-Step Ahead Predictions......Page 530
10.6.5 Kalman Filter and Missing Data......Page 534
10.6.6 Parameter Estimation......Page 537
10.6.7.1 Revised State-Space Model......Page 538
10.6.7.3 Ψj Complex......Page 539
Appendix 10A: Derivation of State-Space Results......Page 541
Appendix 10B: Basic Kalman Filtering Routines......Page 548
Exercises......Page 552
11.1 Long Memory......Page 556
11.2 Fractional Difference and FARMA Processes......Page 557
11.3 Gegenbauer and GARMA Processes......Page 566
11.3.2 Gegenbauer Process......Page 568
11.3.3 GARMA Process......Page 572
11.4 k-Factor Gegenbauer and GARMA Processes......Page 576
11.4.1 Calculating Autocovariances......Page 581
11.5 Parameter Estimation and Model Identification......Page 583
11.6 Forecasting Based on the k-Factor GARMA Model......Page 589
11.7 Testing for Long Memory......Page 591
11.7.1 Testing for Long Memory in the Fractional and FARMA Setting......Page 593
11.8 Modeling Atmospheric CO2 Data Using Long-Memory Models......Page 594
Appendix 11A: R Commands......Page 597
Exercises......Page 606
12.1 Shortcomings of Traditional Spectral Analysis for TVF Data......Page 609
12.2 Window-Based Methods that Localize the “Spectrum” in Time......Page 611
12.2.1 Gabor Spectrogram......Page 613
12.2.2 Wigner–Ville Spectrum......Page 616
12.3 Wavelet Analysis......Page 615
12.3.2 Wavelet Analysis Introduction......Page 617
12.3.3 Fundamental Wavelet Approximation Result......Page 622
12.3.4 Discrete Wavelet Transform for Data Sets of Finite Length......Page 624
12.3.5 Pyramid Algorithm......Page 628
12.3.6 Multiresolution Analysis......Page 629
12.3.7 Wavelet Shrinkage......Page 635
12.3.8 Scalogram: Time-Scale Plot......Page 638
12.3.9 Wavelet Packets......Page 643
12.3.10 Two-Dimensional Wavelets......Page 650
12.4 Concluding Remarks on Wavelets......Page 653
Appendix 12A: Mathematical Preliminaries for This Chapter......Page 654
Appendix 12B: Mathematical Preliminaries......Page 658
Exercises......Page 662
13.1 Generalized-Stationary Processes......Page 665
13.2 M-Stationary Processes......Page 666
13.2.1 Continuous M-Stationary Process......Page 667
13.2.2 Discrete M-Stationary Process......Page 669
13.2.3 Discrete Euler(p) Model......Page 670
13.2.4 Time Transformation and Sampling......Page 671
13.3 G(λ)-Stationary Processes......Page 674
13.3.1 Continuous G(p; λ) Model......Page 678
13.3.2 Sampling the Continuous G(λ)-Stationary Processes......Page 680
13.3.2.1 Equally Spaced Sampling from G(p; λ) Processes......Page 679
13.3.3 Analyzing TVF Data Using the G(p; λ) Model......Page 682
13.3.3.1 G(p; λ) Spectral Density......Page 684
13.4 Linear Chirp Processes......Page 696
13.4.1 Models for Generalized Linear Chirps......Page 700
13.5 G-Filtering......Page 704
13.6 Concluding Remarks......Page 707
Appendix 13A: G-Stationary Basics......Page 709
Appendix 13B: R Commands......Page 713
Exercises......Page 719
References......Page 721
Index......Page 730