Sudipto Banerjee, Anindya Roy

June 6, 2014
by Chapman and Hall/CRC

Textbook
- 580 Pages
- 10 B/W Illustrations

ISBN 9781420095388 - CAT# K10023

Series: Chapman & Hall/CRC Texts in Statistical Science



- Provides in-depth coverage of important topics in linear algebra that are useful for statisticians, including the concept of rank, the fundamental theorem of linear algebra, projectors, and quadratic forms
- Requires no prior knowledge of linear algebra—not even at the undergraduate level
- Shows how the same result can be derived using multiple techniques
- Describes several computational techniques for orthogonal reduction
- Highlights popular algorithms for eigenvalues and eigenvectors of both symmetric and unsymmetric matrices
- Presents an accessible proof of Jordan decomposition
- Includes material relevant in multivariate statistics and econometrics, such as Kronecker and Hadamard products
- Offers an extensive collection of exercises on theoretical concepts and numerical computations

**Linear Algebra and Matrix Analysis for Statistics** offers a gradual exposition of linear algebra without sacrificing the rigor of the subject. It presents both the vector space approach and the canonical forms in matrix theory. The book is as self-contained as possible, assuming no prior knowledge of linear algebra.

The authors first address the rudimentary mechanics of linear systems using Gaussian elimination and the resulting decompositions. They introduce Euclidean vector spaces using less abstract concepts and make connections to systems of linear equations wherever possible. After illustrating the importance of the rank of a matrix, they discuss complementary subspaces, oblique projectors, orthogonality, orthogonal projections and projectors, and orthogonal reduction.
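As a brief illustration of the first of these topics (this sketch is not taken from the book), Gaussian elimination with partial pivoting reduces a linear system to upper-triangular form, after which back substitution recovers the solution; the rank of the coefficient matrix then tells us whether the system is nonsingular:

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: bring the largest remaining pivot to row k
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        # Eliminate entries below the pivot
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
b = np.array([1.0, 2.0, 3.0])
x = gaussian_solve(A, b)
assert np.allclose(A @ x, b)
# Rank — a concept the book emphasizes — confirms A is nonsingular
assert np.linalg.matrix_rank(A) == 3
```

The elimination steps above are exactly what produces the LU-type decompositions the authors discuss alongside Gaussian elimination.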

The text then shows how the theoretical concepts developed are useful in analyzing solutions for linear systems. The authors also explain how determinants are useful for characterizing and deriving properties concerning matrices and linear systems. They then cover eigenvalues, eigenvectors, singular value decomposition, Jordan decomposition (including a proof), quadratic forms, and Kronecker and Hadamard products. The book concludes with accessible treatments of advanced topics, such as linear iterative systems, convergence of matrices, more general vector spaces, linear transformations, and Hilbert spaces.
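Several of the decompositions and products named in this chapter sequence can be sketched in a few lines of NumPy (an illustrative example, not material from the book):

```python
import numpy as np

# A symmetric matrix has real eigenvalues and orthonormal eigenvectors
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eigh(S)
assert np.allclose(S @ V, V @ np.diag(w))   # S V = V diag(w)

# Singular value decomposition of a rectangular matrix
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])
U, s, Vt = np.linalg.svd(M, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, M)

# Kronecker and Hadamard (elementwise) products
A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
kron = np.kron(A, B)   # block matrix, shape (4, 4)
hadamard = A * B       # elementwise product, shape (2, 2)
assert kron.shape == (4, 4)
```

The Kronecker product in particular appears throughout multivariate statistics, e.g. in the covariance structure of matrix-variate models.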