Constrained Markov Decision Processes
Markov Processes for Stochastic Modeling
March 30, 1999
This book provides a unified approach for the study of constrained Markov decision processes with a finite state space and unbounded costs. Unlike the single controller case considered in many other books, the author considers a single controller with several objectives, such as minimizing delays...
Marcel F. Neuts
July 01, 1995
This unique text collects more than 400 problems in combinatorics, derived distributions, discrete and continuous Markov chains, and models requiring a computer experimental approach. The first book to deal with simplified versions of models encountered in the contemporary statistical or...
Gennady Samorodnitsky, M.S. Taqqu
June 01, 1994
This book presents the similarities between Gaussian and non-Gaussian stable multivariate distributions and introduces one-dimensional stable random variables. It discusses the most basic sample path properties of stable processes, namely sample boundedness and continuity....