Abstract:
An optimal stochastic control problem is considered for systems with unbounded controls satisfying an integral constraint. It is shown that an optimal control exists within the class of generalized controls, which lead to impulse actions. By applying a time-transformation approach, developed recently for deterministic systems, the original control problem is shown to be equivalent to an optimal stopping problem. Moreover, a description of the generalized solutions is given in terms of stochastic differential equations governed by a measure.
Published in: 2001 European Control Conference (ECC)
Date of Conference: 04-07 September 2001
Date Added to IEEE Xplore: 27 April 2015
Print ISBN: 978-3-9524173-6-2