**Nonlinear Optimal Control Theory**

Leonard David Berkovitz, Negash G. Medhin

Published August 25, 2012 by Chapman and Hall/CRC

Reference
- 392 Pages
- 12 B/W Illustrations

ISBN 9781466560260 - CAT# K15884

Series: Chapman & Hall/CRC Applied Mathematics & Nonlinear Science


USD $109.95


- Covers the main topics of optimal control theory, including the Pontryagin principle, Bellman’s dynamic programming method, and theorems about the existence of optimal controls
- Presents diverse examples from flight mechanics, chemical and electrical engineering, production planning models, and the classical calculus of variations
- Describes control system models with and without time delays
- Presents proofs of the Pontryagin maximum principle
- Includes an introduction to Hamilton-Jacobi theory, tying together the viscosity solution and Subbotin approaches
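For orientation, one common textbook formulation of the Pontryagin maximum principle (a sketch of the standard statement, not quoted from this book) is: along an optimal state–control pair $(x^*, u^*)$ there exists a costate $p$ satisfying

```latex
% Hamiltonian for dynamics \dot{x} = f(t, x, u) with running cost L:
H(t, x, u, p) = \langle p, f(t, x, u) \rangle - L(t, x, u)

% Costate (adjoint) equation along the optimal trajectory:
\dot{p}(t) = -\frac{\partial H}{\partial x}\bigl(t, x^*(t), u^*(t), p(t)\bigr)

% Maximum condition: the optimal control maximizes H pointwise in t,
% over the admissible control set U:
H\bigl(t, x^*(t), u^*(t), p(t)\bigr) = \max_{u \in U} H\bigl(t, x^*(t), u, p(t)\bigr)
```

Sign and normalization conventions vary between texts; the book itself presents full proofs of the principle.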

**Nonlinear Optimal Control Theory** presents a deep, wide-ranging introduction to the mathematical theory of the optimal control of processes governed by ordinary differential equations and certain types of differential equations with memory. Many examples illustrate the mathematical issues that need to be addressed when using optimal control techniques in diverse areas.

Drawing on classroom-tested material from Purdue University and North Carolina State University, the book gives a unified account of bounded state problems governed by ordinary, integrodifferential, and delay systems. It also discusses Hamilton-Jacobi theory. By providing a rigorous, self-contained treatment of finite-dimensional control problems, the book equips readers with the foundation to deal with other types of control problems, such as those governed by stochastic differential equations, partial differential equations, and differential games.