- 1 Chapter 1 Introduction
- 2 Chapter 2 Variational Calculus
- 2.1 Basic concepts and principles of the variational method
- 2.2 Solving optimal control problems with the variational method
- 2.3 Direct methods for variational problems
- 3 Chapter 3 Pontryagin’s Principle
- 3.1 Introduction and a simple example
- 3.2 The minimum principle for continuous systems
- 3.3 The minimum principle for discrete systems
- 3.4 Some typical optimal control problems
- 3.4.1 The minimum-time problem
- 3.4.2 The minimum-fuel problem
- 3.4.3 The minimum-energy problem
- 4 Chapter 4 Dynamic Programming
- 4.1 The shortest route problem
- 4.2 The principle of optimality
- 4.3 Basic recurrence equations of dynamic programming
- 4.4 Numerical methods for dynamic programming
- 4.5 Dynamic programming for continuous control systems
- 4.6 The relationship between dynamic programming, the minimum principle, and the variational method
- 5 Chapter 5 Linear Quadratic Regulator (LQR)
- 5.1 The linear quadratic optimal control problem
- 5.2 The state regulator problem
- 5.3 The output regulator problem
- 5.4 The tracking problem
- 5.5 The optimal regulator with a prescribed degree of stability
- 5.6 The state regulator problem under step disturbances
- 5.7 The optimal regulator with a state observer
- 5.8 The optimal regulator problem for discrete systems
- 6 Chapter 6 H∞ Optimization and Robust Control
- 6.1 The robust control problem
- 6.2 Relevant mathematical background
- 6.3 H∞ optimal control theory
- 7 Chapter 7 Receding Horizon Optimization and Model Predictive Control
- 7.1 Introduction to receding horizon optimization and model predictive control
- 7.2 Model algorithmic control
- 7.3 Dynamic matrix control
- 7.4 Generalized predictive control
- 8 Chapter 8 Linear Optimization and Nonlinear Optimization
- 8.1 Linear optimization problems
- 8.2 Nonlinear optimization problems
- 8.3 Numerical methods
- 9 Chapter 9 Intelligent Optimization
- 10 Supplement
- 11 Good Homework Results
- 11.1 2022-3-22
- 11.2 2022-3-29
- 11.3 2022-4-19
- 12 Experiments