Abstract
Time optimal control problems for systems with impulsive controls are investigated. Sufficient conditions for the existence of time optimal controls are given. A dynamic programming principle is derived, and Lipschitz continuity of an appropriately defined value functional is established. The value functional is shown to satisfy a Hamilton–Jacobi–Bellman equation in the viscosity sense. A numerical example for a rider-swing system is presented, and it is shown that allowing impulsive controls enlarges the reachable set compared to nonimpulsive controls.
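To illustrate the dynamic programming principle behind the value functional, the following is a minimal sketch (not from the paper) for the simplest nonimpulsive case: the scalar system x' = u with |u| ≤ 1 steered to the target {0}. The minimum-time function is T(x) = |x|, and it solves the stationary HJB equation |T'(x)| = 1 with T(0) = 0. The grid size and Jacobi-style value iteration below are illustrative assumptions, not the scheme used in the paper.

```python
import numpy as np

# Minimum-time problem for x' = u, |u| <= 1, target {0}.
# Exact solution: T(x) = |x|, the viscosity solution of |T'| = 1, T(0) = 0.
# We approximate T by value iteration on a uniform grid, where moving to a
# neighboring cell under u = +/-1 costs one time step dt.

N = 201                        # grid points on [-1, 1] (illustrative choice)
xs = np.linspace(-1.0, 1.0, N)
dt = xs[1] - xs[0]             # time to cross one cell at speed |u| = 1
T = np.full(N, np.inf)         # unknown values start at +infinity
target = np.abs(xs) < dt / 2   # the cell containing the target x = 0
T[target] = 0.0

for _ in range(2 * N):         # enough sweeps to propagate from the target
    Tnew = T.copy()
    for i in range(N):
        if target[i]:
            continue
        # candidate successor values under the controls u = -1 and u = +1
        cands = []
        if i > 0:
            cands.append(T[i - 1])
        if i < N - 1:
            cands.append(T[i + 1])
        # dynamic programming update: T(x) = dt + min_u T(x + dt*u)
        Tnew[i] = dt + min(cands)
    if np.allclose(Tnew, T):   # fixed point reached
        break
    T = Tnew
```

After convergence, `T` agrees with |x| up to the grid resolution; e.g. the value at x = 0.5 is 0.5. Impulsive controls, by contrast, would permit instantaneous jumps in the state, which is exactly the mechanism that can enlarge the reachable set in the paper's rider-swing example.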
| Original language | English |
|---|---|
| Pages (from-to) | 75-97 |
| Number of pages | 23 |
| Journal | Applied Mathematics and Optimization |
| Volume | 75 |
| Issue number | 1 |
| Publication status | Published - 1 Feb 2017 |
| Externally published | Yes |
Keywords
- Hamilton–Jacobi–Bellman equations
- Impulsive differential equations
- Optimal control