Minimal Time Problem with Impulsive Controls

Karl Kunisch, Zhiping Rao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Time optimal control problems for systems with impulsive controls are investigated. Sufficient conditions for the existence of time optimal controls are given. A dynamic programming principle is derived and Lipschitz continuity of an appropriately defined value functional is established. The value functional satisfies a Hamilton–Jacobi–Bellman equation in the viscosity sense. A numerical example for a rider-swing system is presented, and it is shown that the reachable set is enlarged by allowing for impulsive controls, when compared to nonimpulsive controls.
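For orientation, the Hamilton–Jacobi–Bellman equation mentioned in the abstract can be illustrated in its classical, nonimpulsive form. The notation below (dynamics $f$, control set $A$, target set $\mathcal{T}$, minimal-time function $T$) is standard textbook notation and is not taken from the paper; the impulsive setting treated in the article leads to a modified equation for a suitably redefined value functional.

```latex
% Minimal time problem: dynamics \dot{y}(t) = f(y(t), a(t)), a(t) \in A,
% with T(x) the minimal time to steer x into the target set \mathcal{T}.
% In the viscosity sense, T satisfies the stationary HJB equation
\[
  \sup_{a \in A} \bigl\{ -f(x,a) \cdot \nabla T(x) \bigr\} = 1,
  \qquad x \in \mathbb{R}^n \setminus \mathcal{T},
\]
% together with the boundary condition
\[
  T(x) = 0, \qquad x \in \partial\mathcal{T}.
\]
```

In the impulsive case the control enters through measures rather than ordinary functions, so the value functional and the Hamiltonian must be adapted accordingly, as developed in the article.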

Original language: English
Pages (from-to): 75-97
Number of pages: 23
Journal: Applied Mathematics and Optimization
Volume: 75
Issue number: 1
DOIs
Publication status: Published - 1 Feb 2017
Externally published: Yes

Keywords

  • Hamilton–Jacobi–Bellman equations
  • Impulsive differential equations
  • Optimal control