The paper proposes an extension of “Prioritized Removed Sets Revision” (PRSR) to DL-LiteR stratified knowledge bases. The revision strategy is based on inconsistency minimization and consists of determining the smallest subsets of assertions to drop from the current DL-LiteR knowledge base, taking the stratification into account, in order to restore consistency and accept the input. We consider different forms of input: a membership assertion, a positive inclusion axiom, or a negative inclusion axiom. We show that, depending on the form of the input and under some conditions, PRSR can be achieved in polynomial time.
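The strategy above can be illustrated with a small sketch. It assumes conflicts are binary (in DL-Lite, violations of negative inclusions involve pairs of assertions) and models the stratification as a list of strata ordered from most to least reliable; a removed set must hit every conflict, and candidates are compared lexicographically on how many assertions they remove per stratum. The function name and brute-force enumeration are illustrative, not the paper's algorithm:

```python
from itertools import chain, combinations

def prsr_removed_sets(strata, conflicts):
    """Brute-force sketch of prioritized removed-sets revision.

    strata: list of sets of ABox assertions, most reliable stratum first.
    conflicts: list of assertion pairs that jointly violate a
    negative inclusion axiom.
    Returns the removed sets that hit every conflict and are
    lexicographically minimal w.r.t. per-stratum removal counts.
    """
    involved = sorted(set(chain.from_iterable(conflicts)))

    def hits_all(removed):
        # Consistency is restored iff every conflict loses a member.
        return all(a in removed or b in removed for a, b in conflicts)

    def profile(removed):
        # Removals per stratum, most reliable first; smaller is better.
        return tuple(len(removed & stratum) for stratum in strata)

    candidates = []
    for r in range(len(involved) + 1):
        for combo in combinations(involved, r):
            removed = set(combo)
            if hits_all(removed):
                candidates.append(removed)

    best = min(profile(s) for s in candidates)
    return [s for s in candidates if profile(s) == best]

# With A(a) in the more reliable stratum, revision drops B(a):
result = prsr_removed_sets([{"A(a)"}, {"B(a)"}], [("A(a)", "B(a)")])
```

Lexicographic comparison of the per-stratum profiles is what makes the revision prioritized: a candidate removing nothing from the most reliable stratum beats any candidate that does, regardless of later strata.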