
Dependability Evaluation of the OGSA-DAI Middleware

  • Chapter in Achievements in European Research on Grid Systems

Abstract

One important contribution to the community developing Grid middleware is the definition and implementation of benchmarks and tools to assess the performance and dependability of Grid applications and of the corresponding middleware. In this paper, we present an experimental study conducted with OGSA-DAI, a popular middleware package that provides access to remote data resources through a unified Web-service front-end. The results show that OGSA-DAI is quite stable and performed well in scalability tests executed on Grid5000. However, we also demonstrate that OGSA-DAI WSI currently uses a SOAP container (Apache Axis 1.2.1) that suffers from severe memory leaks. We show that the default configuration of OGSA-DAI is not affected by this problem, but that a small change in the configuration of a Web service can lead to very unreliable execution of OGSA-DAI.
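The kind of leak reported for the SOAP container typically surfaces in long-running stress tests as heap usage that grows monotonically across requests instead of stabilising after warm-up. As a minimal, hypothetical sketch (not taken from the paper — the function name and the synthetic sample data are assumptions for illustration), the detection logic over periodic memory samples might look like:

```python
def leak_suspected(samples, window=5, tolerance=0.02):
    """Flag leak-like behaviour: memory grows by more than `tolerance`
    (as a fraction) across every `window`-sized gap between samples.

    `samples` is a list of periodic heap/RSS readings (e.g. in MB)
    taken while a fixed workload is replayed against the service.
    """
    if len(samples) < window + 1:
        return False
    # Fractional growth of each sample relative to the one `window`
    # steps earlier; a healthy service oscillates around a plateau,
    # so at least some of these stay at or below the tolerance.
    growth = [(samples[i] - samples[i - window]) / samples[i - window]
              for i in range(window, len(samples))]
    return all(g > tolerance for g in growth)


# Synthetic heap samples (MB): a stable service vs. a leaking one.
stable = [120, 118, 121, 119, 120, 122, 118, 121, 120, 119]
leaking = [120, 135, 150, 168, 185, 204, 226, 250, 277, 305]

print(leak_suspected(stable))   # stable plateau -> False
print(leak_suspected(leaking))  # sustained growth -> True
```

In practice the samples would come from an external monitor of the container's JVM (e.g. periodic heap readings), so a transient allocation spike does not trip the check the way sustained growth does.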




Copyright information

© 2008 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Hoarau, W., Tixeuil, S., Rodrigues, N., Sousa, D., Silva, L. (2008). Dependability Evaluation of the OGSA-DAI Middleware. In: Gorlatch, S., Bubak, M., Priol, T. (eds) Achievements in European Research on Grid Systems. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-72812-4_17

  • DOI: https://doi.org/10.1007/978-0-387-72812-4_17

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-0-387-72811-7

  • Online ISBN: 978-0-387-72812-4

  • eBook Packages: Computer Science (R0)
