Abstract
This paper proposes a benchmark test management framework (BTMF) that simulates realistic database application environments based on TPC benchmarks. BTMF exposes configuration parameters for both the test system (TS) and the system under test (SUT), so more authentic SUT performance figures can be obtained by tuning these parameters. We use Petri nets and transfer matrices to describe intricate testing workload characteristics, so that configuration parameters for different database applications can be determined easily. We conduct three workload-characteristics experiments based on the TPC-App benchmark to validate BTMF and the workload modeling approach.
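One way to read the transfer matrix mentioned in the abstract is as a Markov transition matrix over the workload's interaction types, whose stationary distribution gives the long-run transaction mix. The sketch below is purely illustrative: the state names and probabilities are made up for demonstration and are not the matrices defined by BTMF or the TPC-App specification.

```python
import numpy as np

# Hypothetical transfer (transition-probability) matrix over three
# illustrative TPC-App-style interaction types. P[i][j] is the
# probability that interaction j is issued immediately after i;
# each row therefore sums to 1.
states = ["NewCustomer", "NewProduct", "OrderStatus"]
P = np.array([
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
    [0.5, 0.3, 0.2],
])

# The steady-state mix is the left eigenvector of P for eigenvalue 1.
# For an ergodic chain, any row of a high power of P converges to it.
pi = np.linalg.matrix_power(P, 64)[0]
print(dict(zip(states, pi.round(3))))
```

A test driver could then draw each next interaction type from the row of `P` indexed by the current one, reproducing the modeled request sequence rather than a fixed round-robin mix.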
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Ye, X., Xie, J., Wang, J., Tang, H., Du, N. (2009). An Approach of Performance Evaluation in Authentic Database Applications. In: Nambiar, R., Poess, M. (eds) Performance Evaluation and Benchmarking. TPCTC 2009. Lecture Notes in Computer Science, vol 5895. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10424-4_18
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-10423-7
Online ISBN: 978-3-642-10424-4