Optimization

Basic information

Workload: 

45 hours

Syllabus: 

Deterministic optimization:

  • Convexity. Properties of convex and strongly convex functions.
  • First and second order optimality conditions. Lagrange multipliers and duality.
  • Gradient method.
  • Line searches.
  • Newton and quasi-Newton methods.
  • Subgradient method.
  • Conjugate gradient.
  • Uzawa method.
  • Cutting plane and bundle methods.
  • Dynamic and dual dynamic programming with cut selection.
  • Implementation of numerical optimization algorithms (see the sketch below).
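
To make the last item concrete, below is a minimal Python sketch of the gradient method with a backtracking (Armijo) line search, applied to a strongly convex quadratic. The function names, tolerances, and test problem are illustrative assumptions, not part of the course material.

    import numpy as np

    def gradient_descent_armijo(f, grad, x0, tol=1e-8, max_iter=1000,
                                alpha0=1.0, beta=0.5, c=1e-4):
        """Gradient method with a backtracking (Armijo) line search.

        Stops when the gradient norm falls below `tol`.
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) <= tol:
                break
            # Backtrack until the Armijo sufficient-decrease condition holds.
            t = alpha0
            fx = f(x)
            while f(x - t * g) > fx - c * t * g.dot(g):
                t *= beta
            x = x - t * g
        return x

    # Example: minimize the strongly convex quadratic f(x) = 1/2 x'Qx - b'x.
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ Q @ x - b @ x
    grad = lambda x: Q @ x - b
    x_star = gradient_descent_armijo(f, grad, x0=np.zeros(2))
    print(x_star, np.linalg.solve(Q, b))  # both should agree
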

Stochastic optimization:

  • Risk measures.
  • Chance-constrained problems.
  • Robust Stochastic Approximation.
  • Stochastic Mirror Descent (see the sketch after this list).
  • Multi-cut decomposition methods with cut selection.
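
The sketch below illustrates Stochastic Mirror Descent over the probability simplex with the entropic mirror map, combined with the iterate averaging used in Robust Stochastic Approximation. The objective (a noisy linear cost), step size, and iteration count are assumptions chosen purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def stochastic_mirror_descent(sample_grad, n, steps=5000, eta=0.05):
        """Stochastic mirror descent on the probability simplex with the
        entropic mirror map (exponentiated-gradient update), returning the
        averaged iterate as in robust stochastic approximation.
        """
        x = np.full(n, 1.0 / n)           # start at the uniform distribution
        x_avg = np.zeros(n)
        for t in range(1, steps + 1):
            g = sample_grad(x)            # unbiased stochastic gradient
            x = x * np.exp(-eta * g)      # entropic (multiplicative) update
            x /= x.sum()                  # re-normalize onto the simplex
            x_avg += (x - x_avg) / t      # running average of the iterates
        return x_avg

    # Example: minimize E[<xi, x>] over the simplex, with xi a noisy cost vector.
    c = np.array([0.3, 0.1, 0.4])          # true expected costs (illustrative)
    sample_grad = lambda x: c + 0.1 * rng.standard_normal(c.size)
    x_hat = stochastic_mirror_descent(sample_grad, n=c.size)
    print(x_hat)

With a suitably chosen step size, the averaged iterate approaches a minimizer of the expected cost; in this toy example the mass concentrates on the coordinate with the smallest mean cost.
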

Bibliography

Mandatory: 

•    M. Bandarra and V. Guigues. Multicut decomposition methods with cut selection for multistage stochastic programs. Optimization Online, 2017.
•    J.F. Bonnans, J.C. Gilbert, C. Lemaréchal and C. Sagastizábal. Numerical Optimization: Theoretical and Practical Aspects. Springer, 2003.
•    V. Guigues. Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures. Mathematical Programming, 163:169-212, 2016.
•    A. Shapiro, D. Dentcheva and A. Ruszczyński. Lectures on Stochastic Programming: Modeling and Theory. SIAM, Philadelphia, 2009.