WIAS Preprint No. 3168 (2025)

Continuous time stochastic optimal control under discrete time partial observations



Authors

  • Bayer, Christian
    ORCID: 0000-0002-9116-0039
  • Djehiche, Boualem
  • Rezvanova, Eliza
  • Tempone, Raúl

2020 Mathematics Subject Classification

  • 60H10, 60H35, 93E20

Keywords

  • Markov decision process, stochastic optimal control, filtering, partially observed state, Bayesian updates, measure-valued process

DOI

10.20347/WIAS.PREPRINT.3168

Abstract

This work addresses stochastic optimal control problems where the unknown state evolves in continuous time while partial, noisy, and possibly controllable measurements are available only in discrete time. We develop a framework for controlling such systems, focusing on the measure-valued process of the system's state and on control actions that depend on noisy and incomplete data. Our approach uses a stochastic optimal control framework with a probability measure-valued state, which accommodates noisy measurements and integrates them into control decisions through a Bayesian update mechanism. We characterize control optimality in terms of a sequence of interlaced Hamilton-Jacobi-Bellman (HJB) equations coupled with controlled impulse steps at the measurement times. For the case of Gaussian-controlled processes, we derive an equivalent HJB equation whose state variable is finite-dimensional, namely the state's mean and covariance. We demonstrate the effectiveness of our methods through numerical examples, including control under perfect observations, control under no observations, and control under noisy observations. Our numerical results highlight significant differences in the control strategies and their performance, emphasizing the challenges and computational demands of dealing with uncertainty in state observation.
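
As a brief illustration of the Bayesian update mechanism mentioned in the abstract, here is a minimal sketch in notation of our own choosing (the likelihood $\ell$, observation matrix $H$, and noise covariance $R$ below are illustrative symbols, not necessarily those of the paper). Between measurement times, the conditional distribution $\mu_t$ of the state evolves according to the controlled forward dynamics; at a measurement time $t_k$ with observation $y_k$, it is updated by Bayes' rule,

\[
\mu_{t_k}(dx) \;=\; \frac{\ell(y_k \mid x)\,\mu_{t_k^-}(dx)}{\int \ell(y_k \mid x')\,\mu_{t_k^-}(dx')},
\]

where $\mu_{t_k^-}$ denotes the pre-measurement distribution. In the Gaussian setting, assuming a linear observation $y_k = H x_{t_k} + \varepsilon_k$ with $\varepsilon_k \sim \mathcal{N}(0, R)$, this update reduces to the standard Kalman-type recursion for the mean $m$ and covariance $P$,

\[
K = P^- H^\top \bigl(H P^- H^\top + R\bigr)^{-1}, \qquad
m^+ = m^- + K\,(y_k - H m^-), \qquad
P^+ = (I - K H)\,P^-,
\]

so the measure-valued control problem collapses to one on the finite-dimensional state $(m, P)$, consistent with the equivalent HJB equation described above.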
