Repairing and reasoning with inconsistent and uncertain ontologies

https://doi.org/10.1016/j.advengsoft.2011.10.015

Abstract

With the development of the Semantic Web, the quality and correctness of ontologies play an increasingly important role in semantic representation and knowledge sharing. In real situations, however, ontologies are often inconsistent and uncertain. Because it is difficult to ensure the quality of ontologies, there is a growing need to deal with inconsistency and uncertainty in real-world applications of ontological reasoning and management. This paper adopts two methods for handling inconsistent and uncertain ontologies. The first repairs the inconsistency: algorithms RIO and RIUO are proposed to compute the candidate repair set, and the consistency of the ontology can be restored by deleting or modifying the axioms in a candidate repair set. The second develops a non-standard reasoning method to obtain meaningful answers: algorithms RMU and RMIU are proposed to perform query-specific reasoning over inconsistent and uncertain ontologies without changing the original ontologies. Finally, a prototype system is constructed, and the experimental results validate the usability and effectiveness of our approaches.

Highlights

► Algorithms RIO and RIUO are proposed to repair inconsistent and uncertain ontologies.
► Algorithms RMU and RMIU are proposed to reason with inconsistent and uncertain ontologies.
► Query-specific reasoning methods obtain correct answers over inconsistent ontologies.
► A prototype system validates the effectiveness of the proposed approaches.

Introduction

Ontologies play an important role in the Semantic Web [1]. The development of Semantic Web is highly dependent on the quality and correctness of ontologies. However, in real situations, ontologies are often inconsistent and uncertain.

An inconsistent ontology is one that contains an error or conflict, so that some of its concepts cannot be interpreted correctly. Inconsistencies may arise from misrepresentation, polysemy, migration from another formalism, or integration of multiple sources. Inconsistency leads to wrong answers during ontology reasoning, and also to false semantic understanding and knowledge representation.

Existing reasoners are able to detect inconsistencies in ontologies, but they offer relatively limited support for resolving them. Some researchers have already addressed this issue [2], [3], [4], [5], [6], but their methods are either computationally inefficient or domain-specific, and they cannot find all the solutions that repair the inconsistency. Accordingly, the ability to resolve inconsistency efficiently and find the best solution is of utmost importance in ontological reasoning and management.

Additionally, ontologies sometimes carry uncertainty. An uncertain ontology is one whose correctness is probabilistic. Without a means of expressing uncertainty, we are unable to state much of what we know. There are three main sources of uncertain ontologies: (1) the subjective uncertainty of the experts who build them, (2) uncertainty inherited from the original ontologies when an ontology is integrated from several sub-ontologies, and (3) uncertainty introduced by (semi-)automatic ontology learning tools.

Existing methods [7], [8], [9], [10], [11] can reduce uncertainty to some extent, but they do not resolve the problem well, especially when inconsistency and uncertainty occur simultaneously. Dealing with uncertainty in semantically aware systems has therefore become a pressing problem.

Because inconsistent and uncertain ontologies are widespread on the Semantic Web, and it is very hard to guarantee the quality of ontologies, this paper proposes two methods for dealing with them.

The first method repairs the error whenever an inconsistency is encountered. First, the minimal inconsistent subset, extended error set, and candidate repair set of the ontology are defined; then the algorithm RIO is presented to compute the candidate repair set. The consistency of the ontology can be restored efficiently by deleting or modifying the axioms in a candidate repair set. The confidence of the ontology is then considered, measuring how confident the user is in the correctness of its elements, and the algorithm RIUO is proposed to repair inconsistency and uncertainty simultaneously.
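To make the pipeline concrete, here is a minimal Python sketch of computing minimal inconsistent subsets and candidate repair sets. It is an illustration only, not the paper's RIO algorithm: the consistency oracle is a toy over (atom, truth-value) axioms, and the enumeration is brute force, whereas RIO operates on DL axioms with a real reasoner.

```python
from itertools import combinations

def is_consistent(axioms):
    # Toy oracle (assumption): an axiom is an (atom, truth_value) pair;
    # a set is inconsistent when some atom is asserted both true and false.
    seen = {}
    for atom, val in axioms:
        if seen.get(atom, val) != val:
            return False
        seen[atom] = val
    return True

def minimal_inconsistent_subsets(ontology):
    # Enumerate subsets smallest-first; keep inconsistent subsets that
    # contain no previously found (smaller) inconsistent subset.
    mis, axioms = [], list(ontology)
    for k in range(1, len(axioms) + 1):
        for subset in combinations(axioms, k):
            s = set(subset)
            if any(m <= s for m in mis):
                continue
            if not is_consistent(s):
                mis.append(s)
    return mis

def candidate_repair_sets(mis_list):
    # Minimal hitting sets of the MISes: deleting any one candidate
    # repair set breaks every minimal conflict at once.
    universe = set().union(*mis_list) if mis_list else set()
    hits = []
    for k in range(1, len(universe) + 1):
        for cand in combinations(sorted(universe), k):
            c = set(cand)
            if any(h <= c for h in hits):
                continue
            if all(c & m for m in mis_list):
                hits.append(c)
    return hits
```

For the toy ontology {("Bird", True), ("Bird", False), ("Penguin", True)}, the single MIS is the contradictory pair about "Bird", and the two candidate repair sets each delete one of its axioms, which matches the intuition that either assertion could be the erroneous one.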

The second method develops a non-standard reasoning approach to obtain meaningful answers. Algorithm RMU is presented to reason with uncertain ontologies; algorithm RMIU performs query-specific reasoning over inconsistent and uncertain ontologies without changing the original ontologies. The reasoning route and certainty degree of each answer are also provided, helping the user select the most credible answer.
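The interplay of a query-specific selection function, reasoning route, and certainty degree can be sketched as follows. This is a simplified illustration over Horn-style rules with assumed confidence values, not the paper's RMU/RMIU algorithms: relevance is purely syntactic, and certainty is combined by taking the minimum along the route.

```python
def _relevant(rule, symbols):
    # A rule is (body_atoms, head_atom, confidence); it is syntactically
    # relevant when it shares a symbol with the current query context.
    body, head, _ = rule
    return head in symbols or bool(set(body) & symbols)

def query(rules, facts, goal):
    # Step 1: incrementally grow the selected subset to a fixpoint,
    # starting from the goal symbol, so only query-relevant rules
    # ever enter the reasoning process.
    symbols, selected = {goal}, []
    changed = True
    while changed:
        changed = False
        for rule in rules:
            if rule not in selected and _relevant(rule, symbols):
                selected.append(rule)
                symbols |= set(rule[0]) | {rule[1]}
                changed = True
    # Step 2: forward-chain inside the selected subset, tracking for
    # each derived atom its certainty degree and reasoning route.
    derived = {f: (1.0, [f]) for f in facts}
    changed = True
    while changed:
        changed = False
        for body, head, conf in selected:
            if head not in derived and all(b in derived for b in body):
                cert = min([conf] + [derived[b][0] for b in body])
                route = sum((derived[b][1] for b in body), []) + [head]
                derived[head] = (cert, route)
                changed = True
    return derived.get(goal)  # (certainty, route), or None if underivable
```

With rules penguin→bird (0.9) and bird→can_fly (0.6) and the fact "penguin", querying "can_fly" returns certainty 0.6 together with the route penguin → bird → can_fly; a rule such as fish→swims is never selected because it shares no symbol with the query.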

The primary contributions of this paper are summarized as follows:

  • 1.

    Generally speaking, there is more than one solution that can recover the consistency of an ontology, but existing methods cannot find all of them. Our methods compute candidate repair sets that include all possible solutions, including the best one. This makes it easier for users to choose the most suitable solution according to their knowledge and specific scenario, analyze it, and make customized modifications.

  • 2.

    The reasoning methods for inconsistent and uncertain ontologies adopt an incremental selection function that selects only the elements related to a specific query. The elements necessary for reasoning can therefore be obtained quickly, and query results are returned efficiently.

  • 3.

    The query results returned by our method include not only “True” and “False” answers but also the specific reasoning route (path) and certainty degree of each answer. The user thus obtains more information with which to select the most credible query result according to its certainty degree.

  • 4.

    An empirical study of the proposed methods is reported. The prototype system constructs an ontology for testing, and the evaluation results validate the usability and promise of our approaches.

The rest of the paper is organized as follows: Section 2 introduces relevant terms and definitions as preliminaries. Section 3 describes the algorithms RIO and RIUO for repairing inconsistent and uncertain ontologies. Section 4 presents the algorithms RMU and RMIU for reasoning with inconsistent and uncertain ontologies. The system implementation and experimental results are discussed in Section 5. Section 6 reviews related work on handling inconsistent and uncertain ontologies. Finally, conclusions and future work are presented in Section 7.


Description logic

Description Logics (DLs for short) are a family of well-studied set-description languages that have been used to formalize knowledge for over two decades [3]. They have a well-defined model-theoretic semantics, which allows a number of reasoning services to be automated.

DL is equipped with a formal, logic-based semantics. A distinguished feature is the emphasis on reasoning as a central service: reasoning allows one to infer implicitly represented knowledge from the knowledge that is …
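As a toy illustration (not taken from the paper), the following DL axioms are jointly unsatisfiable once an individual is asserted to be a penguin:

```latex
\mathit{Penguin} \sqsubseteq \mathit{Bird}, \quad
\mathit{Bird} \sqsubseteq \mathit{CanFly}, \quad
\mathit{Penguin} \sqsubseteq \lnot \mathit{CanFly}, \quad
\mathit{Penguin}(\mathit{tweety})
```

A reasoner derives both CanFly(tweety) and ¬CanFly(tweety), so the ontology is inconsistent; any minimal inconsistent subset must combine the assertion about tweety with axioms producing this contradiction.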

Repairing inconsistent and uncertain ontologies

In this section, two methods for repairing inconsistent ontologies are proposed and linked to DL-based systems. The first method efficiently tackles the inconsistency problem of ontologies. The second further resolves inconsistency and uncertainty at the same time.

The key idea of our method is as follows: when an ontology is found to be inconsistent, first the minimal inconsistent subset (MIS) is calculated, and then the extended error sets are obtained, which contain …
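When confidence values are available for individual axioms, the choice among candidate repair sets can be guided by them. The following is a minimal sketch of one plausible selection criterion, an assumption for illustration rather than the paper's RIUO algorithm: prefer the repair set whose deleted axioms carry the least total confidence, breaking ties by deleting fewer axioms.

```python
def best_repair(candidates, confidence):
    # Hypothetical selection rule: minimize the total confidence of the
    # axioms to be deleted, then the number of deletions. Axiom names
    # and confidence values here are illustrative, not from the paper.
    return min(candidates,
               key=lambda c: (sum(confidence[a] for a in c), len(c)))
```

For example, with candidates [{"ax1"}, {"ax2", "ax3"}] and confidences {"ax1": 0.9, "ax2": 0.3, "ax3": 0.4}, the second set is chosen: deleting two low-confidence axioms (total 0.7) is preferred over deleting one axiom the user is 90% sure of.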

Reasoning with inconsistent and uncertain ontologies

In this section, we illustrate the reasoning method for inconsistent and uncertain ontologies. It makes it possible to obtain the query results most likely to be correct, sparing the user the cost of revising the existing ontologies.

First the key steps, Possibility extension, Selection function, and Reasoning route, are introduced; then the algorithms RMU and RMIU are presented with an example.

System and implementation

To validate the methods proposed in this paper, experiments are conducted from two aspects. Section 5.1 introduces the experiment on repairing inconsistent and uncertain ontologies, and Section 5.2 presents the experiment on reasoning with inconsistent and uncertain ontologies.

Related work

Normally, there are two main strategies to deal with inconsistent ontologies.

One is to resolve the error whenever an inconsistency is encountered. This approach tries to find the minimal subsets of an ontology that need to be repaired or removed to render the ontology logically correct. Schlobach et al. [3] calculated diagnoses of an inconsistent ontology from pinpoints, but could not find all the possible solutions; their method may therefore miss the best solution. Deng et al. [4] utilized Shapley …

Conclusions and future work

Generally speaking, existing ontologies have four sources: (1) built manually by people, (2) integrated from several ontologies, (3) evolved from a former ontology, or (4) (semi-)automatically learned by tools. Whatever the source, it is very hard to ensure the quality and correctness of the ontology. Ontology reasoners are able to detect inconsistencies in ontologies but offer limited support for resolving them.

So this paper proposes two kinds of methods to deal with the …

References (25)

  • Lee TB. Semantic web road map. W3C design issues. <http://www.w3.org/DesignIssues/Semantic.html>; ...
  • Huang ZS, van Harmelen F, ten Teije A. Reasoning with inconsistent ontologies. In: Proceedings of IJCAI’05; 2005. p. ...
  • Schlobach S, Huang ZS. Inconsistent ontology diagnosis and repair. <http://wasp.cs.vu.nl/sekt/dion/sekt363.pdf>; ...
  • Deng X, et al. Measuring inconsistencies in ontologies. Lect Notes Comput Sci; 2007.
  • Kalyanpur A. Debugging and repair of OWL ontologies. PhD thesis, University of Maryland; ...
  • Gao S, et al. Ontology-based semantic matchmaking approach. Adv Eng Softw; 2007.
  • Ding Z, et al. A probabilistic extension to ontology language OWL.
  • Bacchus F. Representing and reasoning with probabilistic knowledge; 1990.
  • Koller D, Levy A, Pfeffer A. P-classic: a tractable probabilistic description logic. In: Proceedings of AAAI-97; 1997. ...
  • Straccia U. Towards a fuzzy description logic for the semantic web (preliminary report). In: Proceedings of the second ...
  • Lam SC, Pan JZ, Sleeman D, Vasconcelos W. Ontology inconsistency handling: ranking and rewriting axioms. Technical ...
  • Baader F, et al. The description logic handbook: theory, implementation, and applications; 2003.