Abstract:
When model uncertainty is located in the parameters (interval model), an interval observer has been shown to be a suitable strategy for generating an adaptive threshold to be used in residual evaluation. In interval observer-based fault detection methods, the observer gain plays an important role: it determines the minimum detectable fault for a given fault type and allows enhancing the observer's fault detection properties while mitigating computational drawbacks of the model (e.g. the wrapping effect and computational complexity). In this paper, the effect of the observer gain on the time evolution of the residual sensitivity to a fault is analyzed. Then, using these sensitivity studies, the time evolution of the minimum detectable fault is established. On this basis, three types of faults are introduced according to the time evolution of their detectability: permanently (strongly) detected, non-permanently (weakly) detected, and non-detected. Finally, an example based on a mineral grinding-classification process is used to illustrate the derived results.
Published in: 2007 46th IEEE Conference on Decision and Control
Date of Conference: 12-14 December 2007
Date Added to IEEE Xplore: 21 January 2008
Print ISSN: 0191-2216