1st Edition

Time Series: A First Course with Bootstrap Starter

    586 Pages
    by Tucker S. McElroy and Dimitris N. Politis
    Published by Chapman & Hall

    Time Series: A First Course with Bootstrap Starter provides an introductory course on time series analysis that satisfies the triptych of (i) mathematical completeness, (ii) computational illustration and implementation, and (iii) conciseness and accessibility to upper-level undergraduate and M.S. students. Basic theoretical results are presented in a mathematically convincing way, and the methods of data analysis are developed through examples and exercises worked out in R. A student with a basic course in mathematical statistics will learn both how to analyze time series and how to interpret the results.

    The book provides the foundation of time series methods, including linear filters and a geometric approach to prediction. The important paradigm of ARMA models is studied in depth, as are frequency domain methods. Entropy and other information-theoretic notions are introduced, with applications to time series modeling.

    The second half of the book focuses on statistical inference, the fitting of time series models, and computational facets of forecasting. Many time series of interest are nonlinear, in which case classical inference methods can fail, but bootstrap methods may come to the rescue. Distinctive features of the book are the emphasis on geometric notions and the frequency domain, the discussion of entropy maximization, and a thorough treatment of recent computer-intensive methods for time series, such as subsampling and the bootstrap. There are more than 600 exercises, half of which involve R coding and/or data analysis. Supplements include a website with 12 key data sets and all R code for the book's examples, as well as solutions to the exercises.
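
    To give a concrete flavor of the computer-intensive methods treated in the book's final chapter, here is a minimal R sketch (not taken from the book's own code or data) of a moving block bootstrap applied to the mean of a simulated AR(1) series. Only base R functions (arima.sim, sample, quantile) are used, and the block length b = 10 is an arbitrary illustrative choice.

        # Moving block bootstrap of the sample mean for a simulated AR(1) series.
        # A rough illustration only; Chapter 12 develops block bootstrap methods
        # and their theory in detail.
        set.seed(1)
        n <- 200
        x <- arima.sim(model = list(ar = 0.6), n = n)   # simulate AR(1) with phi = 0.6

        block_boot_mean <- function(x, b, B = 1000) {
          n <- length(x)
          starts <- 1:(n - b + 1)   # admissible block starting points
          k <- ceiling(n / b)       # blocks needed to rebuild a series of length n
          replicate(B, {
            idx <- unlist(lapply(sample(starts, k, replace = TRUE),
                                 function(s) s:(s + b - 1)))[1:n]
            mean(x[idx])            # recompute the statistic on the resampled series
          })
        }

        boot_means <- block_boot_mean(x, b = 10)
        quantile(boot_means, c(0.025, 0.975))   # rough percentile interval for the mean

    Resampling whole blocks preserves the serial dependence within each block, which is why this approach can succeed where the classical i.i.d. bootstrap fails for dependent data; Chapter 12 also covers subsampling and sieve bootstraps as alternatives.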

    1. Introduction
       Time Series Data
       Cycles in Time Series Data
       Spanning and Scaling Time Series
       Time Series Regression and Autoregression
       Overview
       Exercises

    2. The Probabilistic Structure of Time Series
       Random Vectors
       Time Series and Stochastic Processes
       Marginals and Strict Stationarity
       Autocovariance and Weak Stationarity
       Illustrations of Stochastic Processes
       Three Examples of White Noise
       Overview
       Exercises

    3. Trends, Seasonality, and Filtering
       Nonparametric Smoothing
       Linear Filters and Linear Time Series
       Some Common Types of Filters
       Trends
       Seasonality
       Trend and Seasonality Together
       Integrated Processes
       Overview
       Exercises

    4. The Geometry of Random Variables
       Vector Space Geometry and Inner Products
       L2(Ω, P, F): The Space of Random Variables with Finite Second Moment
       Hilbert Space Geometry
       Projection in Hilbert Space
       Prediction of Time Series
       Linear Prediction of Time Series
       Orthonormal Sets and Infinite Projection
       Projection of Signals
       Overview
       Exercises

    5. ARMA Models with White Noise Residuals
       Definition of the ARMA Recursion
       Difference Equations
       Stationarity and Causality of the AR(1)
       Causality of ARMA Processes
       Invertibility of ARMA Processes
       The Autocovariance Generating Function
       Computing ARMA Autocovariances via the MA Representation
       Recursive Computation of ARMA Autocovariances
       Overview
       Exercises

    6. Time Series in the Frequency Domain
       The Spectral Density
       Filtering in the Frequency Domain
       Inverse Autocovariances
       Spectral Representation of Toeplitz Covariance Matrices
       Partial Autocorrelations
       Application to Model Identification
       Overview
       Exercises

    7. The Spectral Representation
       The Herglotz Theorem
       The Discrete Fourier Transform
       The Spectral Representation
       Optimal Filtering
       Kolmogorov's Formula
       The Wold Decomposition
       Spectral Approximation and the Cepstrum
       Overview
       Exercises

    8. Information and Entropy
       Introduction
       Events and Information Sets
       Maximum Entropy Distributions
       Entropy in Time Series
       Markov Time Series
       Modeling Time Series via Entropy
       Relative Entropy and Kullback-Leibler Discrepancy
       Overview
       Exercises

    9. Statistical Estimation
       Weak Correlation and Weak Dependence
       The Sample Mean
       CLT for Weakly Dependent Time Series
       Estimating Serial Correlation
       The Sample Autocovariance
       Spectral Means
       Statistical Properties of the Periodogram
       Spectral Density Estimation
       Refinements of Spectral Analysis
       Overview
       Exercises

    10. Fitting Time Series Models
        MA Model Identification
        EXP Model Identification
        AR Model Identification
        Optimal Prediction Estimators
        Relative Entropy Minimization
        Computation of Optimal Predictors
        Computation of the Gaussian Likelihood
        Model Evaluation
        Model Parsimony and Information Criteria
        Model Comparisons
        Iterative Forecasting
        Applications to Imputation and Signal Extraction
        Overview
        Exercises

    11. Nonlinear Time Series Analysis
        Types of Nonlinearity
        The Generalized Linear Process
        The ARCH Model
        The GARCH Model
        The Bi-spectral Density
        Volatility Filtering
        Overview
        Exercises

    12. The Bootstrap
        Sampling Distributions of Statistics
        Parameters as Functionals and Monte Carlo
        The Plug-in Principle and the Bootstrap
        Model-based Bootstrap and Residuals
        Sieve Bootstraps
        Time Frequency Toggle Bootstrap
        Subsampling
        Block Bootstrap Methods
        Overview
        Exercises

    A. Probability
       Probability Spaces
       Random Variables
       Expectation and Variance
       Joint Distributions
       The Normal Distribution
       Exercises

    B. Mathematical Statistics
       Data
       Sampling Distributions
       Estimation
       Inference
       Confidence Intervals
       Hypothesis Testing
       Exercises

    C. Asymptotics
       Convergence Topologies
       Convergence Results for Random Variables
       Asymptotic Distributions
       Central Limit Theory for Time Series
       Exercises

    D. Fourier Series
       Complex Random Variables
       Trigonometric Polynomials

    E. Stieltjes Integration
       Deterministic Integration
       Stochastic Integration

    Biographies

    Tucker S. McElroy is Senior Time Series Mathematical Statistician at the U.S. Census Bureau, where he has contributed to developing time series research and software for the last 15 years. He has published more than 80 papers and is a recipient of the Arthur S. Flemming award (2011).

    Dimitris N. Politis is Distinguished Professor of Mathematics at the University of California, San Diego, where he also serves as Associate Director of the Halıcıoğlu Data Science Institute. He has co-authored two research monographs and more than 100 journal papers. He is a recipient of the Tjalling C. Koopmans Econometric Theory Prize (2009-2011) and is Co-Editor of the Journal of Time Series Analysis.



    "The authors should be congratulated for providing many concise and compact proofs for various technical assertions in time series. (There are many seemingly inconspicuous but intriguing technical details in time series!) The authors' strength and perhaps also their preference in frequency domain methods are well-reflected in the treatments in Chapters 6, 7 and 9, and also some parts of Chapters 10 and 11. Chapter 12 introduces several of the most popular bootstrap methods for time series, including AR-sieve bootstrap, block bootstrap and frequency domain bootstrap. In terms of the mathematical level, the book is for students with a solid mathematical background. The style of the presentation would also better suit courses offered in statistics, mathematics or engineering programmes for which spectral analysis is pertinent."
    ~International Statistical Review

    "The first eight chapters of this book mainly focus on understanding the structure of time series. From the ninth chapter onwards, they discuss statistical inference based on time series data…Since the book includes a large number of exercises, teachers of a course on time series may find this book useful. Overall, researchers working in the area of time series may also find this book a useful reference. Finally, applied researchers involved with time series data may also find this book helpful." ~ISCB News

    "This new monograph by McElroy (US Census Bureau) and Politis (Univ. of California, San Diego) is a timely publication, whereas the more well-known time series monographs were published long ago (in the 1980s and 1990s).. this volume stands out as an ideal source for readers exploring time series analysis both theoretically and empirically…Some unique topics are introduced, for example, information entropy in time series, time-series-specific statistical inference, and dependent data bootstrapping. The latter represents an important recent advancement in time series analysis."
    ~CHOICE