1st Edition

Time Series Modeling of Neuroscience Data

By Tohru Ozaki, Copyright 2012
    574 Pages, 86 B/W Illustrations
    Published by CRC Press

    Recent advances in brain science measurement technology have given researchers access to very large-scale time series data, such as EEG/MEG data (20 to 100 dimensions) and fMRI data (140,000 dimensions). Analyzing such massive data requires efficient computational and statistical methods.

    Time Series Modeling of Neuroscience Data shows how to analyze neuroscience data efficiently with the Wiener-Kalman-Akaike approach, in which dynamic models of all kinds, such as linear/nonlinear differential equation models and time series models, are used to whiten the temporally dependent time series in the framework of linear/nonlinear state space models. Using as little mathematics as possible, the book explores the basic concepts of this approach and their derivatives as useful tools for time series analysis. Unique features include:

    • A statistical identification method for highly nonlinear dynamical systems such as the Hodgkin-Huxley model, the Lorenz chaos model, the Zetterberg model, and more
    • Methods and applications for Dynamic Causality Analysis developed by Wiener, Granger, and Akaike
    • A state space modeling method for dynamicization of solutions to inverse problems
    • A heteroscedastic state space modeling method for dynamic non-stationary signal decomposition for applications to signal detection problems in EEG data analysis
    • An innovation-based method for the characterization of nonlinear and/or non-Gaussian time series
    • An innovation-based method for spatial time series modeling for fMRI data analysis
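
    A minimal sketch of the whitening idea described above, assuming a simple linear state space model: a temporally dependent signal (here a simulated AR(2) process observed with noise) is run through a Kalman filter, and the resulting one-step prediction errors (innovations) are checked for approximate whiteness. The model, parameter values, and whiteness check are illustrative assumptions only, not taken from the book, which treats the general linear/nonlinear case and its likelihood-based identification.

# Sketch of "whitening by state space modeling": run a Kalman filter on
# temporally dependent observations and inspect the standardized
# innovations. Model, parameters, and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate an AR(2) signal observed with measurement noise ----------
T = 2000
phi1, phi2 = 1.5, -0.8          # assumed (stable) AR coefficients
sigma_w, sigma_v = 1.0, 0.5     # system and observation noise std devs
x = np.zeros(T)
for t in range(2, T):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + sigma_w * rng.standard_normal()
y = x + sigma_v * rng.standard_normal(T)   # noisy observations

# --- State space form: state = (x_t, x_{t-1}), y_t = H state + v_t -----
F = np.array([[phi1, phi2],
              [1.0,  0.0]])
H = np.array([[1.0, 0.0]])
Q = np.diag([sigma_w**2, 0.0])
R = np.array([[sigma_v**2]])

# --- Kalman filter: collect standardized innovations -------------------
s = np.zeros(2)                 # filtered state mean
P = np.eye(2)                   # filtered state covariance
innovations = np.zeros(T)
for t in range(T):
    # predict
    s_pred = F @ s
    P_pred = F @ P @ F.T + Q
    # innovation (one-step prediction error) and its variance
    e = y[t] - (H @ s_pred)[0]
    S = (H @ P_pred @ H.T + R)[0, 0]
    innovations[t] = e / np.sqrt(S)
    # update
    K = (P_pred @ H.T) / S
    s = s_pred + K[:, 0] * e
    P = P_pred - K @ H @ P_pred

# --- Whiteness check: sample autocorrelations should be near zero ------
ec = innovations - innovations.mean()
acf = [np.dot(ec[:-k], ec[k:]) / np.dot(ec, ec) for k in range(1, 6)]
print("lag-1..5 autocorrelations of innovations:", np.round(acf, 3))
# For comparison, the raw observations remain strongly autocorrelated:
yc = y - y.mean()
print("lag-1 autocorrelation of raw data:", round(np.dot(yc[:-1], yc[1:]) / np.dot(yc, yc), 3))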

    A central aim of this book is to show that the same data can be treated using both a dynamical system approach and a time series approach, so that neural and physiological information can be extracted more efficiently. Of course, time series modeling is valid not only in neuroscience data analysis but also in many other scientific and engineering fields where statistical inference from observed time series data plays an important role.

    Introduction
    Time-Series Modeling
    Continuous-Time Models and Discrete-Time Models
    Unobserved Variables and State Space Modeling

    Dynamic Models for Time Series Prediction
    Time Series Prediction and the Power Spectrum

    Fantasy and Reality of Prediction Errors
    Power Spectrum of Time Series

    Discrete-Time Dynamic Models

    Linear Time Series Models
    Parametric Characterization of Power Spectra
    Tank Model and Introduction of Structural State Space Representation
    Akaike’s Theory of Predictor Space
    Dynamic Models with Exogenous Input Variables

    Multivariate Dynamic Models

    Multivariate AR Models
    Multivariate AR Models and Feedback Systems
    Multivariate ARMA Models
    Multivariate State Space Models and Akaike’s Canonical Realization
    Multivariate and Spatial Dynamic Models with Inputs

    Continuous-Time Dynamic Models

    Linear Oscillation Models
    Power Spectrum
    Continuous-Time Structural Modeling
    Nonlinear Differential Equation Models

    Some More Models

    Nonlinear AR Models
    Neural Network Models
    RBF-AR Models
    Characterization of Nonlinearities
    Hammerstein Model and RBF-ARX Model
    Discussion on Nonlinear Predictors
    Heteroscedastic Time Series Models

    Related Theories and Tools
    Prediction and Doob Decomposition

    Looking at the Time Series from Prediction Errors
    Innovations and Doob Decompositions
    Innovations and Doob Decomposition in Continuous Time

    Dynamics and Stationary Distributions

    Time Series and Stationary Distributions
    Pearson System of Distributions and Stochastic Processes
    Examples
    Different Dynamics Can Arise from the Same Distribution

    Bridge between Continuous-Time Models and Discrete-Time Models

    Four Types of Dynamic Models
    Local Linearization Bridge
    LL Bridges for the Higher Order Linear/Nonlinear Processes
    LL Bridges for the Processes from the Pearson System
    LL Bridge as a Numerical Integration Scheme

    Likelihood of Dynamic Models

    Innovation Approach
    Likelihood for Continuous-Time Models
    Likelihood of Discrete-Time Models
    Computationally Efficient Methods and Algorithms
    Log-Likelihood and the Boltzmann Entropy

    State Space Modeling
    Inference Problem (a) for State Space Models

    State Space Models and Innovations
    Solutions by the Kalman Filter
    Nonlinear Kalman Filters
    Other Solutions
    Discussions

    Inference Problem (b) for State Space Models

    Introduction
    Log-Likelihood of State Space Models in Continuous Time
    Log-Likelihood of State Space Models in Discrete Time
    Regularization Approach and Type II Likelihood
    Identifiability Problems

    Art of Likelihood Maximization

    Introduction
    Initial Value Effects and the Innovation Likelihood
    Slow Convergence Problem
    Innovation-Based Approach versus Innovation-Free Approach
    Innovation-Based Approach and the Local Lévy State Space Models
    Heteroscedastic State Space Modeling

    Causality Analysis

    Introduction
    Granger Causality and Limitations
    Akaike Causality
    How to Define Pair-Wise Causality with the Akaike Method
    Identifying Power Spectrum for Causality Analysis
    Instantaneous Causality
    Application to fMRI Data
    Discussions

    Conclusion: The New and Old Problems

    References
    Index

    Biography

    Tohru Ozaki is a mathematician and statistician. He received his BSc in mathematics from the University of Tokyo in 1969. He then joined the Institute of Statistical Mathematics (ISM), Tokyo, in 1970, where he studied and worked with Hirotugu Akaike. He received his DSc from Tokyo Institute of Technology in 1981 under the supervision of Akaike. From 1987 to 2008, he was a professor at ISM and, after Akaike’s retirement, served as the director of the prediction and control group. His major research areas include time series analysis, nonlinear stochastic dynamic modeling, predictive control, signal processing, and their applications in the neurosciences, control engineering, and financial engineering.

    While he was at ISM, Ozaki was engaged in various projects in applied time series analysis in science and engineering: EEG dynamic inverse problems, spatial time series modeling of fMRI data, causality analysis in behavioral science, modeling nonlinear dynamics in ship engineering, predictive control design in fossil power plant control, seasonal adjustment in official statistics, heteroscedastic modeling and risk-sensitive control in financial engineering, nonlinear dynamic modeling in macroeconomics, spectral analysis of seismology data, point process modeling of earthquake occurrence data, river-flow prediction in stochastic hydrology, etc.

    Ozaki retired from ISM in 2008. Since then he has been a visiting professor at Tohoku University, Sendai, Japan, and at Queensland University of Technology, Brisbane, Australia. He has been involved in supporting several research projects (in dynamic modeling of neuroscience data, fossil power plant control design, and risk-sensitive control in financial engineering) in universities and industry. He also leads, through his international research network, a time series research group called the Akaike Innovation School from his office at Mount Fuji and organizes seminars every summer.

    "With more statisticians working in the direction of methodological and theoretical research with applications in the neurosciences, the present book is timely. The author is an expert statistician who has made significant contributions to the area of time series and stochastic processes, in addition to methodological developments … The book is impressive in terms of the breadth of its coverage of the models and the in-depth discussion on the theoretical properties of both discrete-time and continuous-time that are specific to neuroscience data … the numerous examples on constructing the state-space representation of the time series models were useful, as properly constructing state-space models can be challenging. Moreover, the numerous discussions on the computational challenges for estimating the parameters of state-space models were illuminating."
    Journal of the American Statistical Association, December 2014

    "This is a very unusual book on time series, with much that is new, innovative , and usually not found in other books on time series, for example multivariate AR models, multivariate dynamic models, causal analysis and the Doob decomposition, and so on. Among the major pleasures of browsing through the book are the acquaintance with ‘Laplace’s Demon’, seeing Pearsonian and multimodal distributions as stationary distributions for dynamic models, Einstein’s inductive use of Boltzmann entropy—to mention just a few of the novelties. But the hard core of the book is about state space modeling and its application to neuroscience data. The pages 331 through 351 are a richly textured but precise and detailed introduction to state space modeling. Here is a lovely summary by Ozaki that I have not seen elsewhere—it deals with time series dynamics …"
    —Jayanta K. Ghosh, International Statistical Review (2013), 81

    "This book is essential for every quantitative scientist who is interested in developing rigorous statistical models for analyzing brain signals. It is written by an expert statistician who has made significant contributions to the area of time series and stochastic processes. … His expertise on this subject and interest on the deep issues of statistical modeling of brain signals are clearly reflected in the character of this book. This book builds an important foundation for neurostatistics … it is truly unique in its treatment of the topic because it has an eye towards modeling brain signals, such as electroencephalograms and functional magnetic resonance images, and thus builds on the specifics that are directly relevant to these particular data. … At the University of California, Irvine, researchers have used this book recently and found it to be very helpful. Moreover, I intend to use this book as the primary text for a special topic course on neurostatistics in the Department of Statistics."
    —Hernando Ombao, Journal of Time Series Analysis, 2013