Optimal Control for Chemical Engineers

This self-contained book gives a detailed treatment of optimal control theory that enables readers to formulate and solve optimal control problems. With a strong emphasis on problem solving, it provides all the necessary mathematical analyses and derivations of important results, including multiplier theorems and Pontryagin's principle. The text presents various examples and basic concepts of optimal control, and describes important numerical methods and computational algorithms for solving a wide range of optimal control problems, including periodic processes.
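For orientation, the Pontryagin minimum principle mentioned in the blurb can be stated compactly. The formulation below is the standard textbook version for a fixed-time, free-endpoint problem; the symbols (state x, control u, costate λ, Hamiltonian H) are generic and not drawn from this listing:

```latex
% Minimize J = \int_0^{t_f} L(x,u)\,dt subject to \dot{x} = f(x,u),\; x(0)=x_0.
% Define the Hamiltonian:
H(x,u,\lambda) = L(x,u) + \lambda^{\top} f(x,u)
% Necessary conditions for an optimal pair (x^*, u^*):
\dot{x}^{*} = \frac{\partial H}{\partial \lambda}, \qquad
\dot{\lambda} = -\frac{\partial H}{\partial x}, \qquad \lambda(t_f) = 0,
% and the optimal control minimizes H pointwise over the admissible set U:
u^{*}(t) = \arg\min_{u \in U} H\big(x^{*}(t), u, \lambda(t)\big)
```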
Contents
1
2 Fundamental Concepts
3 Optimality in Optimal Control Problems
4 Lagrange Multipliers
5 Pontryagin's Minimum Principle
6 Different Types of Optimal Control Problems
7 Numerical Solution of Optimal Control Problems
8 Optimal Periodic Control
9 Mathematical Review
Index