Hamilton-Jacobi theory over time scales and applications to linear-quadratic problems
Year of publication | 2012 |
---|---|
Type | Article in Periodical |
Journal / Source | Nonlinear Analysis, Theory, Methods & Applications |
DOI | http://dx.doi.org/10.1016/j.na.2011.09.027 |
Field | General mathematics |
Keywords | Hamilton-Jacobi theory; Verification theorem; Bellman principle; Dynamic programming; Hamilton-Jacobi-Bellman equation; Value function; Linear-quadratic problem; Riccati equation; Feedback controller; Symplectic system; Weak Pontryagin principle |
Description | In this paper we first derive the verification theorem for nonlinear optimal control problems over time scales. That is, we show that the value function is the only solution of the Hamilton-Jacobi equation, in which the minimum is attained at an optimal feedback controller. Application to the linear-quadratic regulator (LQR) problem yields an optimal feedback controller expressed in terms of the solution of a generalized time scale Riccati equation, and we show that every optimal solution of the LQR problem must take this form. A connection between the newly obtained Riccati equation and the traditional one is established. Problems with a shift in the state variable are also considered. As an important tool for the latter theory, we obtain a new formula for the chain rule on time scales. Finally, the corresponding LQR problem with a shift in the state variable is analyzed and the results are related to previous ones. |
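For orientation, the sketch below records the classical continuous-time specialization (the time scale T = R) of the objects named in the abstract: the Hamilton-Jacobi-Bellman equation, the verification theorem, and the LQR Riccati equation with its feedback controller. The notation (V, L, f, A, B, Q, R, P, Γ) is chosen here for illustration only and is not taken from the paper; the paper's time-scale versions, which replace the ordinary derivative by the delta derivative, are not reproduced here.

```latex
% Classical continuous-time case (time scale T = R); a minimal sketch,
% with notation (V, L, f, A, B, Q, R, P, Gamma) chosen for illustration.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The Hamilton--Jacobi--Bellman equation for the value function $V(t,x)$ of
$\min_u \int_t^T L(s,x(s),u(s))\,ds$ subject to $\dot x = f(s,x,u)$ reads
\begin{equation*}
  -V_t(t,x) = \min_{u}\bigl\{ L(t,x,u) + V_x(t,x)\, f(t,x,u) \bigr\},
  \qquad V(T,x) = \varphi(x),
\end{equation*}
and the verification theorem asserts that a smooth solution attaining the
minimum at a feedback $u = u^*(t,x)$ is the value function. For the LQR
problem with $L = \tfrac12\,(x^{\top} Q x + u^{\top} R u)$, $R > 0$, and
dynamics $\dot x = Ax + Bu$, the ansatz $V(t,x) = \tfrac12\, x^{\top} P(t)\, x$
reduces the HJB equation to the Riccati differential equation
\begin{equation*}
  \dot P + A^{\top} P + P A - P B R^{-1} B^{\top} P + Q = 0,
  \qquad P(T) = \Gamma,
\end{equation*}
with the optimal feedback controller $u^*(t,x) = -R^{-1} B^{\top} P(t)\, x$.
\end{document}
```

On a general time scale these formulas acquire additional terms involving the graininess and the forward shift of the state, which is where the generalized Riccati equation and the new chain rule described in the abstract enter.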