Introduction to Time Series Modeling (no index)

In time series modeling, the behavior of a phenomenon is expressed in terms of its own past values and those of other covariates. Because many important phenomena in statistical analysis are in fact time series, and because identifying the conditional distribution of a phenomenon is an essential part of statistical modeling, learning the fundamental methods of time series modeling is both important and useful. Illustrating how to build models for time series using basic methods, Introduction to Time Series Modeling covers numerous time series models and the various tools for handling them.
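
As a rough illustration of this idea (not taken from the book), the following minimal Python sketch fits an autoregressive model by ordinary least squares, assuming only numpy; the function name fit_ar_least_squares and the simulated AR(2) series are hypothetical choices made for the example.

    import numpy as np

    # Minimal sketch: the current value y[t] is modeled as a linear function of
    # its own past values y[t-1], ..., y[t-p] plus noise, and the coefficients
    # are obtained by ordinary least squares.  Illustration only.

    def fit_ar_least_squares(y, p):
        """Fit an AR(p) model  y[t] = a1*y[t-1] + ... + ap*y[t-p] + e[t]."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        # Design matrix: row i contains (y[t-1], ..., y[t-p]) for t = p+i.
        X = np.column_stack([y[p - j - 1:n - j - 1] for j in range(p)])
        target = y[p:]
        coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ coeffs
        sigma2 = resid @ resid / len(target)   # innovation variance estimate
        return coeffs, sigma2

    # Example: recover the coefficients of a simulated AR(2) process.
    rng = np.random.default_rng(0)
    y = np.zeros(500)
    for t in range(2, 500):
        y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()
    print(fit_ar_least_squares(y, 2))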

The book employs the state-space model as a generic tool for time series modeling and presents convenient recursive filtering and smoothing methods, including the Kalman filter, the non-Gaussian filter, and the sequential Monte Carlo filter, for state-space models. Taking a unified approach to model evaluation based on the entropy maximization principle advocated by Dr. Akaike, the author derives various methods of parameter estimation, such as the least squares method, the maximum likelihood method, recursive estimation for state-space models, and model selection by the Akaike information criterion (AIC). Along with simulation methods, he also covers standard stationary time series models, such as AR and ARMA models, as well as nonstationary time series models, including the locally stationary AR model, the trend model, the seasonal adjustment model, and the time-varying coefficient AR model.
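
To make the state-space and AIC vocabulary concrete, here is a minimal, hypothetical Python sketch (again assuming only numpy) of a Kalman filter for a scalar local-level model; the log-likelihood is accumulated from the one-step-ahead prediction errors, and an AIC value is formed from it. It illustrates the general recursions only and is not the book's own code or notation.

    import numpy as np

    # Local-level state-space model (illustrative assumption):
    #     x[t] = x[t-1] + v[t],   v[t] ~ N(0, tau2)     (system model)
    #     y[t] = x[t]   + w[t],   w[t] ~ N(0, sigma2)   (observation model)

    def kalman_loglik(y, tau2, sigma2, x0=0.0, v0=1e7):
        x, v = x0, v0            # filtered state mean and variance
        loglik = 0.0
        for obs in y:
            # one-step-ahead prediction
            xp, vp = x, v + tau2
            # innovation and its variance
            e, f = obs - xp, vp + sigma2
            loglik += -0.5 * (np.log(2 * np.pi * f) + e * e / f)
            # filtering (measurement update)
            k = vp / f            # Kalman gain
            x, v = xp + k * e, (1 - k) * vp
        return loglik

    # Example: a simulated random-walk level observed with noise.
    rng = np.random.default_rng(1)
    level = np.cumsum(rng.normal(scale=0.5, size=200))
    y = level + rng.normal(scale=1.0, size=200)

    ll = kalman_loglik(y, tau2=0.25, sigma2=1.0)
    aic = -2 * ll + 2 * 2    # assuming two estimated parameters: tau2, sigma2
    print(ll, aic)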

With a focus on the description, modeling, prediction, and signal extraction of time series, this book provides basic tools for analyzing time series that arise in real-world problems. It encourages readers to build models for their own real-life problems.

Author(s): Genshiro Kitagawa
Series: Chapman & Hall/CRC Monographs on Statistics & Applied Probability
Publisher: CRC
Year: 2010

Language: English
Pages: 305

Cover Page......Page 1
MONOGRAPHS ON STATISTICS AND APPLIED PROBABILITY......Page 3
Title Page......Page 6
ISBN 1584889217......Page 7
Preface......Page 9
List of Figures......Page 11
List of Tables......Page 16
Contents......Page 18
1.1 Time Series Data......Page 23
1.2 Classification of Time Series......Page 28
1.4 Pre-processing of Time Series......Page 30
1.4.2 Differencing......Page 31
1.4.3 Change from the previous month (quarter) and annual change......Page 32
1.4.4 Moving average......Page 33
1.5 Organization of This Book......Page 35
2.1 The Distribution of Time Series and Stationarity......Page 39
2.2 The Autocovariance Function of Stationary Time Series......Page 42
2.3 Estimation of the Autocovariance Function......Page 43
2.4 Multivariate Time Series and Scatterplots......Page 46
2.5 Cross-Covariance Function and Cross-Correlation Function......Page 48
3.1 The Power Spectrum......Page 53
3.2 The Periodogram......Page 58
3.3 Averaging and Smoothing of the Periodogram......Page 62
3.5 Computation of the Periodogram by Fast Fourier Transform......Page 66
4.1 Probability Distributions and Statistical Models......Page 71
4.2 K-L Information and the Entropy Maximization Principle......Page 76
4.3 Estimation of the K-L Information and Log-Likelihood......Page 78
4.4 Estimation of Parameters by the Maximum Likelihood Method......Page 80
4.5 AIC (Akaike Information Criterion)......Page 84
4.5.1 Evaluation of C1......Page 86
4.5.2 Evaluation of C3......Page 87
4.6 Transformation of Data......Page 88
5.1 Regression Models and the Least Squares Method......Page 93
5.2 Householder Transformation......Page 95
5.3 Selection of Order by AIC......Page 97
5.4 Addition of Data and Successive Householder Reduction......Page 100
5.5 Variable Selection by AIC......Page 101
6.1 ARMA Model......Page 105
6.2 The Impulse Response Function......Page 106
6.3 The Autocovariance Function......Page 107
6.5 The Power Spectrum of the ARMA Process......Page 110
6.6 The Characteristic Equation......Page 114
6.7 The Multivariate AR Model......Page 115
7.1 Fitting an AR Model......Page 125
7.2 Yule-Walker Method and Levinson's Algorithm......Page 127
7.3 Estimation of an AR Model by the Least Squares Method......Page 128
7.4 Estimation of an AR Model by the PARCOR Method......Page 130
7.5 Large Sample Distribution of the Estimates......Page 133
7.6 Estimation of a Multivariate AR Model by the Yule-Walker Method......Page 135
7.7 Estimation of a Multivariate AR Model by the Least Squares Method......Page 139
8.1 Locally Stationary AR Model......Page 145
8.2 Automatic Partitioning of the Time Interval into an Arbitrary Number of Subintervals......Page 147
8.3 Precise Estimation of a Change Point......Page 152
9.1 The State-Space Model......Page 157
9.2 State Estimation via the Kalman Filter......Page 160
9.4 Increasing Horizon Prediction of the State......Page 162
9.5 Prediction of Time Series......Page 163
9.6 Likelihood Computation and Parameter Estimation for a Time Series Model......Page 166
9.7 Interpolation of Missing Observations......Page 169
10.1 State-Space Representation of the ARMA Model......Page 173
10.2 Initial State of an ARMA Model......Page 174
10.3 Maximum Likelihood Estimate of an ARMA Model......Page 175
10.4 Initial Estimates of Parameters......Page 176
11.1 The Polynomial Trend Model......Page 181
11.2 Trend Component Model–Model for Probabilistic Structural Changes......Page 184
11.3 Trend Model......Page 187
12.1 Seasonal Component Model......Page 195
12.2 Standard Seasonal Adjustment Model......Page 198
12.3 Decomposition Including an AR Component......Page 201
12.4 Decomposition Including a Trading-Day Effect......Page 206
13.1 Time-Varying Variance Model......Page 211
13.2 Time-Varying Coefficient AR Model......Page 214
13.3 Estimation of the Time-Varying Spectrum......Page 219
13.4 The Assumption on System Noise for the Time-Varying Coefficient AR Model......Page 220
13.5 Abrupt Changes of Coefficients......Page 221
14.1 Necessity of Non-Gaussian Models......Page 225
14.2 Non-Gaussian State-Space Models and State Estimation......Page 226
14.3 Numerical Computation of the State Estimation Formula......Page 228
14.4 Non-Gaussian Trend Model......Page 231
14.5 A Time-Varying Variance Model......Page 235
14.6.1 Processing of the outliers by a mixture of Gaussian distributions......Page 239
14.6.2 A nonstationary discrete process......Page 240
14.6.3 A direct method of estimating the time-varying variance......Page 241
15.1 The Nonlinear Non-Gaussian State-Space Model and Approximations of Distributions......Page 243
15.2.2 Filtering......Page 247
15.2.4 Likelihood of a model......Page 248
15.2.5 Re-sampling method......Page 249
15.2.6 Numerical examples......Page 250
15.3 Monte Carlo Smoothing Method......Page 253
15.4 Nonlinear Smoothing......Page 255
16.1 Generation of Uniform Random Numbers......Page 259
16.2 Generation of Gaussian White Noise......Page 261
16.3 Simulation Using a State-Space Model......Page 263
16.4 Simulation with Non-Gaussian Model......Page 265
16.4.3 Arbitrary distribution......Page 266
A. Algorithms for Nonlinear Optimization......Page 271
B. Derivation of Levinson's Algorithm......Page 273
C.1 Kalman Filter......Page 277
C.2 Smoothing......Page 278
D.1 One-Step-Ahead Prediction......Page 281
D.2 Filter......Page 282
D.3 Smoothing......Page 283
Answers to the Problems......Page 285
Bibliography......Page 299