Optimal Control

Basic information


Duration: 45 hours


Prerequisites: Functional Analysis: Fundamentals, Functional Analysis: Linear Operators, Measure, Integration and Probability


Examples and basic properties of control systems, existence of solutions, continuous dependence on initial conditions and controls. Controllability of linear systems, nonlinear controllability via Lie brackets. Lyapunov stability, feedback laws. Optimal control problems: existence of optimal solutions, optimality conditions, Pontryagin's Maximum Principle, Legendre-Clebsch conditions, second-order conditions. Control-affine systems: bang-bang solutions, the bang-bang principle, singular arcs. Dynamic programming, the Hamilton-Jacobi-Bellman equation. Linear-quadratic optimal control. Shooting methods, direct methods. Applications.
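The linear-quadratic problem listed above is the one case with an explicit closed-form solution, obtained from a backward Riccati recursion. A minimal sketch for a scalar discrete-time system (all function names and numerical values are illustrative, not part of the course materials):

```python
# Finite-horizon LQR for the scalar system x_{k+1} = a*x_k + b*u_k,
# minimizing sum_k (q*x_k**2 + r*u_k**2) + qf*x_N**2.
# The optimal control is a time-varying linear feedback u_k = -K_k * x_k.

def lqr_gains(a, b, q, r, qf, horizon):
    """Backward Riccati recursion; returns the gains K_0, ..., K_{N-1}."""
    p = qf                                       # terminal value P_N = qf
    gains = []
    for _ in range(horizon):
        k = a * b * p / (r + b * b * p)          # optimal gain at this step
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)  # Riccati update
        gains.append(k)
    gains.reverse()                              # recursion runs backward in time
    return gains

def simulate(a, b, gains, x0):
    """Closed-loop rollout under the feedback u_k = -K_k * x_k."""
    x = x0
    traj = [x]
    for k in gains:
        x = a * x + b * (-k * x)
        traj.append(x)
    return traj

# An unstable open-loop system (a > 1) stabilized by the LQR feedback.
gains = lqr_gains(a=1.2, b=1.0, q=1.0, r=1.0, qf=1.0, horizon=20)
traj = simulate(1.2, 1.0, gains, x0=5.0)
```

For this choice of weights the closed-loop factor is well below one, so the trajectory decays toward the origin even though the open-loop system diverges.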



· A. Bressan and B. Piccoli. Introduction to the mathematical theory of control, volume 2 of AIMS Series on Applied Mathematics. American Institute of Mathematical Sciences (AIMS), Springfield, MO, 2007
· H. Schättler and U. Ledzewicz. Geometric optimal control: theory, methods and examples, volume 38. Springer Science & Business Media, 2012
· R. Vinter. Optimal control. Modern Birkhäuser Classics. Birkhäuser Boston, Inc., Boston, MA, 2010. Paperback reprint of the 2000 edition


· F. Clarke. Functional analysis, calculus of variations and optimal control, volume 264. Springer, 2013
· A.E. Bryson and Y.-C. Ho. Applied optimal control. Hemisphere Publishing, New York, 1975
· B. Chachuat. Nonlinear and dynamic optimization: from theory to practice. Technical report, 2007
· J.-M. Coron. Control and nonlinearity. Number 136. American Mathematical Soc., 2007
· L.C. Evans. An introduction to mathematical optimal control theory. Lecture Notes, University of California, Department of Mathematics, Berkeley, 2005