Assessing Iterations of an Automated Ontology Evaluation Procedure

  • Conference paper
On the Move to Meaningful Internet Systems, OTM 2010 (OTM 2010)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 6427)

Abstract

Evaluation of ontologies is becoming increasingly important as the number of available ontologies grows steadily. Because ontology evaluation is a labour-intensive and tedious job, the need for automated evaluation methods is growing. In this paper, we report on experiments using a light-weight automated ontology evaluation procedure (called EvaLexon) developed earlier. The experiments test whether the automated procedure can detect an improvement (or deterioration) in the quality of an ontology miner’s output. Four research questions have been formulated on how to compare two rounds of ontology mining and how to assess potential differences in quality between the rounds. The entire set-up and software infrastructure remain identical during the two rounds of ontology mining and evaluation. The main difference between the two rounds is the upfront manual removal of irrelevant passages from the text corpus, carried out separately by two human experts. Ideally, the EvaLexon procedure evaluates the ontology mining results in the same way as the human experts do. The experiments show that the automated evaluation procedure is sensitive enough to detect a deterioration in the quality of the miner output. However, this sensitivity cannot be reliably qualified as similar to the behaviour of the human experts, as the experts themselves largely disagree on which passages (and triples) are relevant. Novel ways of organising community-based ontology evaluation might be an interesting avenue to explore in order to cope with disagreements between evaluating experts.
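
The abstract does not spell out which measures EvaLexon computes internally. As a purely illustrative sketch (a minimal Python example with hypothetical names such as Triple, cohen_kappa and precision_against_expert, and toy data that is not taken from the paper), the comparison of two mining rounds against two possibly disagreeing experts could be framed as: precision of the mined triples against each expert's set of relevant triples, plus Cohen's kappa as a chance-corrected measure of how far the experts agree with each other.

# Illustrative sketch only -- not the published EvaLexon procedure.
from typing import Dict, Set, Tuple

Triple = Tuple[str, str, str]  # a lexical triple, e.g. (term, role, term)

def cohen_kappa(expert_a: Dict[Triple, bool], expert_b: Dict[Triple, bool]) -> float:
    """Chance-corrected agreement of two experts judging the same triples as relevant or not."""
    shared = expert_a.keys() & expert_b.keys()
    n = len(shared)
    if n == 0:
        return 0.0
    p_o = sum(expert_a[t] == expert_b[t] for t in shared) / n      # observed agreement
    pa = sum(expert_a[t] for t in shared) / n                      # expert A: share judged relevant
    pb = sum(expert_b[t] for t in shared) / n                      # expert B: share judged relevant
    p_e = pa * pb + (1 - pa) * (1 - pb)                            # agreement expected by chance
    return 1.0 if p_e == 1.0 else (p_o - p_e) / (1 - p_e)

def precision_against_expert(mined: Set[Triple], relevant: Set[Triple]) -> float:
    """Share of mined triples that a given expert marked as relevant."""
    return len(mined & relevant) / len(mined) if mined else 0.0

# Toy usage: a drop in precision against both experts from round 1 to round 2
# would point at a deterioration of the miner output; a low kappa between the
# experts warns that a single expert-based gold standard is shaky.
expert_a = {("driver", "holds", "licence"): True, ("page", "has", "footer"): False}
expert_b = {("driver", "holds", "licence"): True, ("page", "has", "footer"): True}
print(cohen_kappa(expert_a, expert_b))   # 0.0: no better than chance on this toy data
round_1 = {("driver", "holds", "licence")}
print(precision_against_expert(round_1, {t for t, rel in expert_a.items() if rel}))  # 1.0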

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Spyns, P. (2010). Assessing Iterations of an Automated Ontology Evaluation Procedure. In: Meersman, R., Dillon, T., Herrero, P. (eds) On the Move to Meaningful Internet Systems, OTM 2010. OTM 2010. Lecture Notes in Computer Science, vol 6427. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16949-6_35

  • DOI: https://doi.org/10.1007/978-3-642-16949-6_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-16948-9

  • Online ISBN: 978-3-642-16949-6
