Abstract:
In an earlier paper [A. Arapostathis, et al., 2003], we introduced the notion of safety control of stochastic discrete event systems (DESs), modeled as controlled Markov chains. Safety was specified as an upper bound on the components of the state probability distribution, and the class of irreducible and aperiodic Markov chains was analyzed for satisfying such a safety property. Under the assumption of complete state observation, we identified (i) the set of all safety-enforcing state-feedback controllers that impose the safety requirement for all safe initial distributions, and (ii) the maximal invariant set of safe distributions for a given state-feedback controller. In this paper we extend the work reported in [A. Arapostathis, et al., 2003] in several ways: (i) safety is specified in terms of both upper and lower bounds; (ii) a quite general class of Markov chains is analyzed, one that does not exclude reducible or periodic chains; (iii) a general iterative algorithm for computing the maximal invariant set of safe distributions is obtained, in which the initial set of the iteration can be arbitrary; and (iv) an explicit upper bound on the number of steps needed for the iterative algorithm to terminate is derived.
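To make the fixed-point idea behind item (iii) concrete, the following is a minimal sketch (not the authors' algorithm from the paper) of an iterative computation of the maximal invariant set of safe distributions. It assumes a finite-state Markov chain with a fixed closed-loop transition matrix P, safety given by componentwise bounds l <= pi <= u on the row-vector distribution pi (with pi_{k+1} = pi_k P), and uses linear programs (via scipy.optimize.linprog, an assumed tooling choice) to test when adding the next constraint layer becomes redundant. All function and variable names here are illustrative.

```python
# Sketch: maximal invariant set of safe distributions for pi_{k+1} = pi_k P,
# with safety l <= pi <= u componentwise. The candidate set after n steps is
#   X_n = { pi : sum(pi) = 1, pi >= 0, l <= pi P^k <= u for k = 0..n },
# and the iteration stops when the constraints for P^{n+1} are redundant on X_n.
import numpy as np
from scipy.optimize import linprog

def maximal_invariant_set(P, l, u, max_iter=100, tol=1e-9):
    """Return [P^0, ..., P^n]; the rows l <= pi P^k <= u describe the fixed point."""
    n_states = P.shape[0]
    powers = [np.eye(n_states)]           # start from the safe set itself (k = 0)
    A_eq = np.ones((1, n_states))          # probability-mass constraint sum(pi) = 1
    b_eq = np.array([1.0])

    for _ in range(max_iter):
        # Inequalities describing the current candidate set X_n.
        A_ub, b_ub = [], []
        for Pk in powers:
            A_ub.append(Pk.T)              # (pi P^k)_j <= u_j
            b_ub.append(u)
            A_ub.append(-Pk.T)             # (pi P^k)_j >= l_j
            b_ub.append(-l)
        A_ub = np.vstack(A_ub)
        b_ub = np.concatenate(b_ub)

        P_next = powers[-1] @ P
        redundant = True
        for j in range(n_states):
            # Maximize (pi P^{n+1})_j over X_n; a value above u_j means the
            # new upper-bound constraint actually cuts the set.
            res = linprog(-P_next[:, j], A_ub=A_ub, b_ub=b_ub,
                          A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n_states)
            if res.success and -res.fun > u[j] + tol:
                redundant = False
            # Minimize (pi P^{n+1})_j over X_n; a value below l_j means the
            # new lower-bound constraint cuts the set.
            res = linprog(P_next[:, j], A_ub=A_ub, b_ub=b_ub,
                          A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n_states)
            if res.success and res.fun < l[j] - tol:
                redundant = False
        if redundant:
            break                           # X_{n+1} = X_n: fixed point reached
        powers.append(P_next)
    return powers
```

As a usage note, calling maximal_invariant_set(P, l, u) on a stochastic matrix P and bound vectors l, u yields the list of powers of P whose associated inequalities characterize the invariant polytope; an empty feasible set at any stage simply means no initial distribution remains safe for all time under this chain.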
Date of Conference: 09-12 December 2003
Date Added to IEEE Xplore: 15 March 2004
Print ISBN: 0-7803-7924-1
Print ISSN: 0191-2216