Journal: IEEE Transactions on Systems, Man, and Cybernetics [Institute of Electrical and Electronics Engineers] | Date: 2023-06-30 | Volume/Issue: 53 (10): 6375-6387 | Citations: 14
Identifier
DOI:10.1109/tsmc.2023.3284612
Abstract
In this article, we study the optimal control problem of continuous-time (CT) time-invariant nonlinear systems with stochastic nonlinear disturbances. A new stochastic adaptive dynamic programming (ADP) method is developed to solve the Hamilton–Jacobi–Bellman equation (HJBE). Using the conditional expectation, the value function and the control law are approximated simultaneously through successive iterations. The asymptotic stability in probability of the closed-loop stochastic system is analyzed by the stochastic Lyapunov direct method, and the convergence of the developed ADP method is established. Finally, four simulations illustrate the effectiveness of the developed method.
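As a rough, hedged sketch (standard notation, not the article's exact formulation), the kind of stochastic HJB equation and successive value/control approximation that such ADP methods target can be written as below; the symbols f, g, h, Q, R, V, and the Wiener process w are assumed here for illustration only.

% CT time-invariant nonlinear system with stochastic disturbance (w: Wiener process) and quadratic-in-u cost
\[
dx = \bigl(f(x) + g(x)u\bigr)\,dt + h(x)\,dw, \qquad
J(x_0) = \mathbb{E}\!\left[\int_0^\infty \bigl(Q(x) + u^\top R u\bigr)\,dt \,\Big|\, x(0)=x_0\right].
\]
% Stochastic HJB equation for the optimal value function V^*, with the second-order (trace) term
% coming from the diffusion; the minimizing control follows from the first-order condition.
\[
0 = \min_u \Bigl\{ Q(x) + u^\top R u + (\nabla V^*)^\top \bigl(f(x)+g(x)u\bigr)
    + \tfrac{1}{2}\,\mathrm{tr}\!\bigl(h(x)^\top \nabla^2 V^*\, h(x)\bigr) \Bigr\},
\qquad
u^* = -\tfrac{1}{2} R^{-1} g(x)^\top \nabla V^*(x).
\]
% Successive (policy-iteration-style) approximation: evaluate V_i under the current control u_i,
% then improve the control; value function and control law are updated together over iterations i.
\[
0 = Q(x) + u_i^\top R u_i + (\nabla V_i)^\top \bigl(f(x)+g(x)u_i\bigr)
    + \tfrac{1}{2}\,\mathrm{tr}\!\bigl(h(x)^\top \nabla^2 V_i\, h(x)\bigr),
\qquad
u_{i+1} = -\tfrac{1}{2} R^{-1} g(x)^\top \nabla V_i(x).
\]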