Observer gain effect in linear interval observer-based fault detection | IEEE Conference Publication | IEEE Xplore

Observer gain effect in linear interval observer-based fault detection


Abstract:

When the model uncertainty is located in the parameters (interval model), an interval observer has been shown to be a suitable strategy for generating an adaptive threshold to be used in residual evaluation. In interval observer-based fault detection methods, the observer gain plays an important role: it determines the minimum detectable fault for a given type of fault and allows the fault detection properties of the observer to be enhanced while mitigating computational drawbacks of the model (i.e., the wrapping effect and computational complexity). In this paper, the effect of the observer gain on the time evolution of the residual sensitivity to a fault is analyzed. Then, using these sensitivity studies, the time evolution of the minimum detectable fault is established. Thus, three types of faults are introduced according to their detectability time evolution: permanently (strongly) detected, non-permanently (weakly) detected, and non-detected. Finally, an example based on a mineral grinding-classification process is used to illustrate the derived results.
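To illustrate the idea behind the method described in the abstract (not the paper's actual algorithm or example), the following is a minimal sketch of interval observer-based fault detection for a hypothetical scalar system x⁺ = a·x + b·u, y = x, with the uncertain parameter a known only to lie in an interval. All numerical values (parameter interval, gain L, fault size, fault time) are invented for illustration. The observer propagates lower and upper state bounds, and a fault is flagged whenever the measurement leaves the predicted output interval, which acts as the adaptive threshold.

```python
def interval_observer_step(lo, hi, y, u, L, a_int, b):
    """One step of a scalar interval observer x_hat+ = (a - L)*x_hat + b*u + L*y.

    The bounds are propagated by evaluating the affine update at the corners
    of the parameter interval a_int and the state interval [lo, hi]."""
    cands = [(a - L) * x + b * u + L * y
             for a in a_int for x in (lo, hi)]
    return min(cands), max(cands)

def run():
    # Hypothetical plant: x+ = a_true*x + b*u, with a_true inside the interval.
    a_true, b, L = 0.80, 1.0, 0.5
    a_int = (0.78, 0.82)               # interval model of the uncertain parameter
    x, lo, hi = 0.0, 0.0, 0.0
    alarms = []
    for k in range(60):
        u = 1.0
        fault = 0.4 if k >= 30 else 0.0   # additive sensor fault injected at k = 30
        y = x + fault                      # measured output (C = 1)
        # Residual evaluation with the adaptive threshold: flag a fault
        # whenever the measurement leaves the predicted output interval.
        alarms.append(not (lo <= y <= hi))
        lo, hi = interval_observer_step(lo, hi, y, u, L, a_int, b)
        x = a_true * x + b * u             # true (fault-free) state update
    return alarms
```

Because the measurement y feeds back into the bounds through the gain L, the observer eventually tracks the faulty output and the alarm may clear after the detection transient; this is one way the distinction the abstract draws between permanently (strongly) and non-permanently (weakly) detected faults can arise, and the choice of L governs which case occurs.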
Date of Conference: 12-14 December 2007
Date Added to IEEE Xplore: 21 January 2008
Print ISSN: 0191-2216
Conference Location: New Orleans, LA, USA

