Abstract
With the rapid growth in information technology, there is a significant increase in research activity in the field of cloud computing. Cloud testing can be interpreted as (i) testing of cloud applications, which involves continuous monitoring of cloud application status to verify Service Level Agreements, and (ii) testing as a cloud service, which involves using the cloud as a testing middleware to execute large-scale simulations of real-time user interactions. This study examines the methodologies and tools used in cloud testing and the current research trends in cloud computing testing.
1 Introduction
Cloud computing has various characteristics, which differ depending on the service area, but the most important ones can be categorized as follows.
- On-demand service: Customers can request resources on a flexible, short-term basis and are charged based on resource utilization [3].
- Rapid elasticity: Customers can rapidly acquire more resources or release unnecessary resources as needed [4].
- Virtualization: Customers can enjoy the portability and versatility of the cloud because the lower-level hardware is abstracted away [4].
- High reliability: Customers can execute time-consuming operations on the cloud because the cloud is built to be fault tolerant [5].
Cloud services are usually classified into three major categories: infrastructure, platform, and software. However, Testing-as-a-Service (TaaS) has recently emerged as a new and in-demand service model.
- Infrastructure-as-a-Service (IaaS): Customers can acquire various computing hardware resources, including data storage, network devices, and processing power [1].
- Platform-as-a-Service (PaaS): Customers can access a set of development environments, including operating systems, databases, and web servers [4].
- Software-as-a-Service (SaaS): Customers can access software applications through a client interface such as a web browser, without the complexity of installation [4].
- Testing-as-a-Service (TaaS): Customers can access test simulation environments and continuously monitor complex program behaviors [6].
Much research is being conducted on cloud testing; the most notable published studies include testing scope in the cloud [7], a cloud testing review [8], an integrated TaaS platform for mobile development [9], a systematic mapping study of empirical studies on software cloud testing methods [10], research on testing software for rapid cloud deployment [11], and cloud API testing [12].
Cloud testing can have multiple interpretations: testing of cloud applications or testing as a cloud service. In general, a Service Level Agreement (SLA) is a contract between a cloud customer and a cloud provider that describes the required quality of service (QoS) and the penalties for QoS violations. Adequate SLA monitoring can be of interest to both end users and cloud providers [13]. From the end user's perspective, it is helpful to know which cloud provider is the best fit, based on systematic testing of attributes such as reliability, performance, and security. Many researchers have investigated this area; for example, Wagle et al. [14] proposed an evaluation model for ranking commercially available cloud providers using an ordered performance heat map. From the cloud provider's perspective, it is important to monitor the health of the cloud system and to avoid the penalties for QoS violations. However, according to [15], testing of the cloud is a non-trivial task that requires continuous measurement of cloud application status and effective monitoring across multiple layers of the cloud stack.
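To make the SLA monitoring idea concrete, the following is a minimal sketch, not drawn from any of the cited frameworks, of checking measured QoS metrics against contractual thresholds; the SLA class, metric names, and threshold values are all illustrative.

```python
from dataclasses import dataclass

@dataclass
class SLA:
    # Illustrative thresholds; real SLAs are negotiated per contract.
    max_mean_response_ms: float = 200.0
    min_availability: float = 0.999

def check_sla(response_times_ms: list[float], successes: int,
              total: int, sla: SLA) -> list[str]:
    """Return human-readable SLA violations for one monitoring window."""
    violations = []
    mean_rt = sum(response_times_ms) / len(response_times_ms)
    if mean_rt > sla.max_mean_response_ms:
        violations.append(
            f"mean response time {mean_rt:.1f} ms exceeds {sla.max_mean_response_ms} ms")
    availability = successes / total
    if availability < sla.min_availability:
        violations.append(
            f"availability {availability:.4f} below {sla.min_availability}")
    return violations

# One example monitoring window:
print(check_sla([120.0, 250.0, 310.0], successes=997, total=1000, sla=SLA()))
```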
As modern web applications become more complex, traditional in-house testing also becomes insufficient in several ways. In-house testing facilities need to be carefully configured and maintained, and there is a significant amount of waste when these facilities are idle. Furthermore, as discussed in [4], in-house testing facilities are usually not powerful enough to simulate real-world concurrent user traffic. Due to the increasing awareness of testing as a cloud service, many businesses [16] have started to consider the trade-offs between keeping their existing testing facilities and migrating testing activities to the cloud. In fact, according to [2], using the cloud as a testing middleware promotes multi-scenario test evaluation through distributed testing and offers large-scale simulation of real-time user interactions. To further understand the benefits of testing as a cloud service, it is useful to explore how testing environments are constructed in the cloud.
2 Cloud Services and Testing
This section describes some published methodologies on testing of the cloud and testing as a cloud service, where the methodologies are categorized based on the type of testing, the type of cloud service, and the evaluation method. Table 1 lists the reference numbers of research papers according to the type of cloud service and the type of testing, and Fig. 1 illustrates the breakdown of research papers according to the evaluation methods.
2.1 Types of Testing
Functionality Testing.
The key focus of functionality testing is to reveal deviations between software's intended and actual functionality. Gao et al. [17] developed a cloud-based testing as a service (CTaaS) system with a UI layer, test space layer, and TaaS layer. Their system includes a pool of GUI-based test scripts from which cloud customers can select and obtain test simulation results after connecting to a GUI-based tool, such as Selenium. The approach suggested by Rosiello et al. [18] combines black-box testing with the technique of fault injection. The authors introduce faults into the system under test by killing selected server threads and observing the output, e.g., the number of successful server responses, to detect inconsistencies in system behavior.
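As a rough illustration of this fault-injection idea, the sketch below sends a fixed request load, kills a hypothetical server worker mid-run, and compares the success count against a fault-free baseline. The URL and worker PID are placeholders, and the tooling in [18] may differ.

```python
import os, signal, threading, urllib.request

def send_requests(url: str, n: int) -> int:
    """Send n GET requests; return how many succeeded (HTTP 200)."""
    ok = 0
    for _ in range(n):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                ok += resp.status == 200
        except Exception:
            pass
    return ok

def inject_fault(worker_pid: int, delay_s: float) -> None:
    """After a delay, kill one server worker to emulate a partial failure."""
    threading.Timer(delay_s, lambda: os.kill(worker_pid, signal.SIGKILL)).start()

# Usage (assumes a local SUT and a known worker PID -- both illustrative):
# baseline = send_requests("http://localhost:8080/health", 100)
# inject_fault(worker_pid=12345, delay_s=1.0)
# with_fault = send_requests("http://localhost:8080/health", 100)
# print(f"baseline: {baseline}/100, with fault: {with_fault}/100")
```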
Performance Testing.
The key focus of performance testing is to determine software’s responsiveness and effectiveness with numerical measurements. Nasiri et al. [19] described a multi-component framework consisting of an application analyzer to find the testable region in source code, a test case generator using SoapUI, and a cloud-based test executor. With this framework, they collected the mean execution times of activities in testing and discovered that the testbed startup time can be reduced by simultaneously initializing the system under test (SUT) master node and the test system machine.
The cloud-based testing as a service (CTaaS) system introduced in [17] includes a UI layer, test space layer, and TaaS layer, together with a pool of performance test scripts, automatic test script generation using workflow graphs, and SaaS performance metrics collection using Amazon EC2's CloudWatch APIs. This work was complemented by the declarative testing environment of [20], which consists of Crawl, a domain-specific language for defining performance test scenarios, and Crawler, a Java-based cloud engine for executing test scenarios and collecting the results. Others [13] created a multi-module framework consisting of one design-time module and four runtime modules; one of the runtime modules is responsible for computing response time, throughput, and reliability in order to detect SLA violations. A further study [21] presents a mobile application testing framework that takes an Android application as input, records application execution using a crawler, and outputs performance metrics of response time, throughput, and network latency.
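As an illustration of metrics collection in the spirit of [17], the following sketch pulls CPU utilization datapoints for one EC2 instance via the standard boto3 CloudWatch API. The region, instance ID, and metric choice are illustrative; this is not the authors' code.

```python
import boto3
from datetime import datetime, timedelta

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,                 # one datapoint per 5 minutes
    Statistics=["Average"],
)
# Datapoints arrive unordered; sort by timestamp before reporting.
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Average"])
```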
A model-based approach to performance testing was suggested by [22]. The authors use a sequential action model to define the cloud services to be tested and a load model to define parameters such as test execution duration and the number of simulated users. Along the same lines, [23] describes a framework for conducting performance testing on cloud applications; it is based on identifying operational modes and actors, calculating the occurrence probability of common operations, and measuring performance in terms of transaction time and page load time. Furthermore, Ghosh et al. [24] proposed three stochastic models (a resource provisioning decision model, a VM provisioning model, and a run-time model) to describe the inner details of a cloud service. Based on these models, the authors also introduced formulas to calculate the QoS metrics of job rejection probability and mean decision delay, which are indicators of system availability and response time.
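A minimal sketch of the load-model idea in [22] follows: the test is parameterized by an execution duration and a number of simulated users, each of which repeatedly performs a transaction and records its latency. The target URL and parameter values are illustrative.

```python
import time, urllib.request
from concurrent.futures import ThreadPoolExecutor

def simulated_user(url: str, duration_s: float) -> list[float]:
    """One virtual user: loop for duration_s, recording per-transaction latency."""
    latencies, deadline = [], time.monotonic() + duration_s
    while time.monotonic() < deadline:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                resp.read()
            latencies.append(time.monotonic() - start)
        except Exception:
            pass  # failed transactions are excluded from latency stats
    return latencies

def run_load_test(url: str, users: int, duration_s: float) -> list[float]:
    """Load model: `users` concurrent virtual users for `duration_s` seconds."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = pool.map(simulated_user, [url] * users, [duration_s] * users)
    return [lat for user_lats in results for lat in user_lats]

# latencies = run_load_test("http://sut.example.com/", users=50, duration_s=60)
```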
Elasticity and Scalability Testing.
The key focus of elasticity and scalability testing is to determine the cloud's ability to adapt when the demand for resources changes. A perfectly elastic cloud can instantaneously scale resources up or down according to demand. However, as described in [25], in reality there is always a delay between the time when a load change is detected and the time when the resource configuration is changed accordingly. Further work on real-time elasticity testing by Albonico et al. [26] includes a framework with three different elasticity states: (i) a scaling-in state, when a resource is released, (ii) a scaling-out state, when a resource is acquired, and (iii) a ready state. For each elasticity state, they executed 2500 operations per second for a fixed period of time and recorded the number of operations that succeeded; the authors used these recordings to identify problems during each elasticity state. Further research by Tsai et al. [27] presents a framework that uses feature selection algorithms to identify the bottlenecks of an application and uses association rules to identify significant relationships between scalability and various parameters, such as the number of concurrent users.
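The per-state measurement in [26] can be sketched as follows. The bookkeeping is shown single-threaded for clarity (sustaining 2500 operations per second would require a concurrent driver), and the operation, rate, and the `wait_for_state` hook are placeholders.

```python
import time

def measure_state(op, rate_per_s: int, duration_s: int) -> float:
    """Execute op() at a target rate for duration_s; return the success ratio."""
    attempted = succeeded = 0
    interval = 1.0 / rate_per_s
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        t0 = time.monotonic()
        attempted += 1
        try:
            op()
            succeeded += 1
        except Exception:
            pass
        # Sleep off the remainder of this operation's time slot, if any.
        time.sleep(max(0.0, interval - (time.monotonic() - t0)))
    return succeeded / attempted if attempted else 0.0

# for state in ("scaling-in", "scaling-out", "ready"):
#     wait_for_state(state)   # hypothetical hook into the cloud's scaler
#     print(state, measure_state(do_read_write, rate_per_s=2500, duration_s=10))
```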
Security Testing.
The key focus of security testing is to reveal security leaks that make the software vulnerable to attacks. Cotroneo et al. [28] described a framework to identify the root cause of security alerts through a conceptual clustering approach; their framework is valuable to testing because large volumes of security issues can be detected and classified automatically. Furthermore, [29] presents an identity-based cloud data integrity checking protocol that consists of six algorithms and can be used for auditing variable-sized data blocks. A framework described by [30] consists of a front-end module that uses the Model-View-Controller pattern, a test environment module that prepares tools and virtual machines, and a testing module that handles test scripts. To achieve security testing, the authors integrated Metasploit, a testing tool that offers security vulnerability scanning, into the test environment module.
A multi-component framework described by [31] consists of a test scheduler for allocating resources, a test controller for checking the status of security scanners, and several other components. The authors mention a set of security scanners, including IBM AppScan and HP WebInspect, that can be used to identify vulnerabilities.
Automatic Testing.
The automation of test case generation is another widely explored aspect of cloud testing. In the framework proposed by [32], semantic information in the form of Web Service Description Language (WSDL) is generated from source code and comments, event sequence graphs are generated from the WSDL information, and test cases are generated from the event sequence graphs. Further research by [33] describes an architecture called Expertus to automate testing of large-scale applications in the cloud; Expertus uses two templates to generate and modify cloud resources and a multi-stage, XML-based code generator to systematically generate complex test scenarios. An interesting piece of work by [34] presents layered parameterized tests that use layered symbolic execution (LSE) to dynamically automate test script generation; the LSE algorithm explores the path-based execution tree of an application and obtains inputs for testing by negating constraints in the path conditions. Based on cloud computing, [59] developed a multi-layered model of a software online testing platform that integrates IaaS and SaaS platforms as an automatic self-service portal for users and an operation and maintenance portal for administrators.
2.2 Testing Cloud Services
Infrastructure as a Service.
There are multiple articles published in this area, each with its own unique and significant contributions. A declarative testing environment that can execute tests in multiple IaaS clouds was discussed in [20]. Others, including [35], suggested using a third-party auditor (TPA) to detect SLA violations of virtual machine CPU speed; the TPA must have its own timing functions and minimal communication overhead, so SLA violations can be detected by comparing the execution times on the cloud provider's virtual machine and on the TPA's virtual machine. In addition, [29] describes a protocol for data integrity checking on IaaS clouds. The advantage of IaaS over traditional servers was demonstrated in [23] by deploying the same application on both infrastructures and comparing their performance under high load. This was complemented by a general description of IaaS clouds using three stochastic models [24]. In some application areas, more than one of these cloud services is needed, as in autonomous neural-network-based 3D positioning systems [55], which are very useful in rescue operations and require highly reliable software and testing services.
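The TPA comparison in [35] can be sketched as follows: run the same deterministic CPU-bound benchmark on the provider's VM and on the TPA's reference VM, then flag a violation when the provider is markedly slower. The benchmark workload and the tolerance value are illustrative, not the authors' own.

```python
import time

def cpu_benchmark(iterations: int = 2_000_000) -> float:
    """A deterministic CPU-bound workload; returns elapsed wall-clock seconds."""
    start = time.perf_counter()
    acc = 0
    for i in range(iterations):
        acc = (acc + i * i) % 1_000_003
    return time.perf_counter() - start

def cpu_sla_violated(provider_s: float, reference_s: float,
                     tolerance: float = 1.25) -> bool:
    """Flag a violation if the provider VM is >25% slower than the reference VM."""
    return provider_s > reference_s * tolerance

# Run cpu_benchmark() on each VM (e.g., over SSH) and compare the timings:
# print(cpu_sla_violated(provider_s=3.9, reference_s=2.8))
```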
Platform as a Service.
Model-based approaches, including [22, 34], that are used for testing enterprise PaaS clouds are good at automating test script generation, making the testing process more efficient. In particular, the framework in [34] uses a layered symbolic execution (LSE) algorithm to automate testing of PaaS applications.
Software as a Service.
When testing SaaS, pairwise testing, a form of black-box testing, appears to be one of the best techniques [36, 58] for reducing the number of test cases. Other proposed frameworks classify security alerts in the SaaS cloud [28], automatically generate test cases based on Service-Oriented Architecture [32], and test the scalability of SaaS applications [27]. Based on the MVC design pattern, [57] studied the construction and robustness testing of a SaaS cloud computing data center.
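As an illustration of pairwise testing, here is a minimal greedy all-pairs generator over invented SaaS configuration parameters; production tools (and the techniques in [36, 58]) are more sophisticated.

```python
from itertools import combinations, product

# Illustrative SaaS configuration parameters.
params = {
    "browser": ["chrome", "firefox", "safari"],
    "tenant_plan": ["free", "pro"],
    "region": ["us", "eu", "ap"],
}
names = list(params)

def pairs_of(test):
    """All (parameter, value) pairs covered by one concrete test."""
    return set(combinations(list(zip(names, test)), 2))

# Every value pair must appear in at least one selected test.
uncovered = set().union(*(pairs_of(t) for t in product(*params.values())))
suite = []
while uncovered:
    # Greedily pick the candidate covering the most still-uncovered pairs.
    best = max(product(*params.values()),
               key=lambda t: len(pairs_of(t) & uncovered))
    suite.append(dict(zip(names, best)))
    uncovered -= pairs_of(best)

print(f"{len(suite)} pairwise tests instead of {3 * 2 * 3} exhaustive combinations")
```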
Testing as a Service.
According to [37], the testing of mobile applications is a complex task due to the rapidly increasing number of different devices, operating systems, runtime environments, and network providers; TaaS can effectively address these difficulties. Prathibhan et al. [21] proposed a mobile application testing tool that serves as an interface between the user's device and the cloud. Their tool systematically goes through a set of device emulators; for each emulator, the execution of the Android application is recorded as UI events, which are used to generate test scenarios. The framework proposed in [37], which was also implemented successfully, consists of multiple components: a cloud controller for administrative management, a hypervisor for virtual machine coordination, a mobile emulator manager for running different Android versions, and a test automation manager for running GUI-based test cases.
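As a rough illustration of emulator-based mobile testing in the spirit of [21, 37], the following sketch installs an APK on several emulators and drives pseudo-random UI events with the standard `adb monkey` tool, saving each event log for later analysis. The emulator serials, APK path, and package name are placeholders.

```python
import subprocess

EMULATORS = ["emulator-5554", "emulator-5556"]   # illustrative serials
APK, PACKAGE = "app-debug.apk", "com.example.app"

for serial in EMULATORS:
    # Install (or reinstall) the app under test on this emulator.
    subprocess.run(["adb", "-s", serial, "install", "-r", APK], check=True)
    # Fire 500 pseudo-random UI events and capture the verbose event log.
    result = subprocess.run(
        ["adb", "-s", serial, "shell", "monkey", "-p", PACKAGE, "-v", "500"],
        capture_output=True, text=True, check=True,
    )
    with open(f"ui_events_{serial}.log", "w") as f:
        f.write(result.stdout)
```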
In another study, the authors of [30] described a cloud-based framework for conducting security tests on a set of mobile applications running Android 4.0. This work was further supported by [38], who developed a framework for setting up large-scale, emulator-based mobile testing on the OpenStack cloud platform. Their framework is composed of a request loader for generating mobile service requests, a resource provisioning engine for running load balancing algorithms, and a graphical service monitor for displaying resource utilization measurements and request processing times.
2.3 Evaluation Methods
Case Studies.
Many authors have evaluated their methodologies by deploying them in real-world cloud systems or using real-world datasets as input. The authors of [19] selected Sunflow, an image rendering application, as the system under test (SUT) and FOSS-Cloud, an open-source IT environment, as the runtime cloud provider. Others [17] selected OrangeHRM, an open-source SaaS application, as the SUT, and Amazon EC2 as the cloud infrastructure on top of which their TaaS system was built. Furthermore, the authors of [22] selected Olio, an open-source social network application, as the SUT, and two public IaaS cloud providers, Amazon EC2 and Rackspace. The selection of OpenStack [13, 37] as the runtime cloud environment was an interesting approach: mean response time was computed to detect SLA violations and to determine the overhead of the proposed test systems, with [37] using five mobile apps from the Google Play Store. By inputting two datasets of security alerts generated from a production SaaS cloud into their framework, the authors of [28] evaluated its effectiveness for classifying alerts in the cloud.
The authors of [30] selected 100 mobile applications with varying vulnerabilities as input to their testing system and verified that it correctly outputs the number of vulnerable applications. This line of evaluation was further explored in [23], where the authors selected a university-based social network application as the SUT and Azure Cloud as the runtime cloud provider. Similarly, [26] selected Amazon EC2 as the runtime cloud provider and the Yahoo Cloud Serving Benchmark, connected to MongoDB, as the SUT. The authors of [33] selected a set of runtime providers, including Emulab, Amazon EC2, Open Cirrus, Wipro, and Elba, and a set of systems under test, including RUBBoS, RUBiS, and Cloudstone. By deploying their framework on different cloud providers and against different systems, the authors of [27] showed the richness of their solution to automated testing. Others [31] implemented their security testing system on Microsoft SQL Server and used it to detect defects in 456 selected web applications.
Prototype Development.
A prototype system developed by [32], which has the typical features of online banking, was used to show how well their testing framework adapts to the prototype system. The authors of [34] implemented a prototype system on a virtual machine that includes all necessary PaaS components, such as a language interpreter and operating system, and evaluated their work based on how quickly test cases can be generated on this prototype system.
Formal Proof.
To demonstrate the soundness of their protocol, the authors of [29] provided a formal proof showing that attacking the protocol through signature forgery has the same complexity as solving the RSA problem with large exponents.
Survey.
Conducting a survey of an experienced team of QA testers proved effective for determining the effectiveness of the CLTF framework proposed by the authors of [22]. The respondents recorded the time they needed to test fifty cloud services, separately using CLTF and using JMeter, a popular open-source testing tool. Based on these responses, the authors showed that their framework is better than JMeter in terms of time cost.
Other Studies.
Further studies have been conducted in cloud computing, including the introduction of a framework [56] to evaluate the risk involved in selecting a service provider in terms of customers' priorities. This framework helps cloud customers choose the providers with the best Service Level Agreement.
3 Discussion and Results
The research papers reviewed in this paper were obtained from electronic databases and were selected based on the title, abstract, and conclusion. In general, there is an uneven distribution of research effort within the area of cloud testing, as demonstrated in Table 1, which shows the distribution according to the type of cloud service and the type of testing.
From Fig. 1, it is clear that the majority of authors demonstrated the usability of their proposals through real-world case studies, indicating that the real-world case study is the most convincing type of evaluation method in the research community.
4 Conclusion
As the field of cloud computing evolves, more research is needed to address the new challenges in the testing of cloud applications and testing as a cloud service. This study contributes to a better understanding of the current trends in cloud testing research. In particular, testing of Platform-as-a-Service is a subarea that needs more investigation. At present, there is no standard toolset for cloud testing. For future work, it may be worthwhile to investigate the different aspects of various commercial testing tools through a comparative study. Nachiyappan et al. [39] have already compared five publicly available cloud testing products in terms of their pros, cons, and pricing models. More available products could be incorporated into a comparative study, and more aspects need to be considered, including the scripting environment and API support of each testing product.
References
Kuyoro, S.O., Ibikunle, F., Awodele, O.: Cloud computing security issues and challenges. Int. J. Comput. Netw. 3(5), 247–255 (2011)
Fang, W., Xiong, Y.: Cloud testing the next generation test technology. In: International Conference on Electronic & Measurement, Chengdu, pp. 291–295 (2011)
Cai, J., Hu, Q.: Analysis for cloud testing of web application. In: International Conference on Systems and Informatics (ICSAI), pp. 293–297 (2014)
Inçki, K., Ari, I., Sözer, H.: A survey of software testing in the cloud. In: IEEE International Conference on Software Security and Reliability Companion, pp. 18–23 (2012)
Wang, W., Wang, B., Huang, J.: Cloud computing and its key techniques. In: IEEE International Conference on Computer Science and Automation Engineering, pp. 404–410 (2011)
Harikrishna, P., Amuthan, A.: A survey of testing as a service in cloud computing. In: International Conference on Computer Communication and Informatics, pp. 1–5 (2016)
Murthy, M.S.N., Suma, V.: Software testing and its scope in CLOUD: a detailed survey. In: International Conference on Innovative Mechanisms for Industry Applications, pp. 269–273 (2017)
Vilkomir, S.: Cloud testing: a state-of-the-art review. Int. J. Inf. Secur. 28(17), 213–222 (2012)
Starov, O., Vilkomir, S.: Integrated TaaS platform for mobile development: architecture solutions. In: Proceedings of the Eighth International Workshop on Automation of Software Testing, pp. 18–19 (2013)
Al-Said, A.A., Brereton, P., Andras, P.: A systematic mapping study of empirical studies on software cloud testing methods. In: IEEE International Conference on Software Quality, Reliability and Security Companion (2017)
Chen, Y., Huang, J., Ji, X.: Research on testing software for rapid cloud deployment. In: International Conference on Electronics and Information Engineering (2017)
Wang, J., et al.: Cloud API testing. In: IEEE International Conference on Software Testing, Verification and Validation Workshops, pp. 385–386 (2017)
Grati, R., Boukadi, K., Ben-Abdallah, H.: A framework for IaaS-to-SaaS monitoring of BPEL processes in the cloud: design and evaluation. In: IEEE/ACS International Conference on Computer Systems and Applications, pp. 557–564 (2014)
Wagle, S.S., Guzek, M., Bouvry, P., Bisdorff, R.: An evaluation model for selecting cloud services from commercially available cloud providers. In: IEEE International Conference on Cloud Computing Technology and Science, pp. 107–114 (2015)
Alhamazani, K., et al.: An overview of the commercial cloud monitoring tools: research dimensions, design issues, and state-of-the-art. Computing 97(4), 357–377 (2015)
Riungu-Kalliosaari, L., Taipale, O., Smolander, K., Richardson, I.: Adoption and use of cloud-based testing in practice. Softw. Qual. J. 24(2), 337–364 (2016)
Gao, J., et al.: A cloud-based TaaS infrastructure with tools for SaaS validation, performance and scalability evaluation. In: IEEE International Conference on Cloud Computing Technology and Science, pp. 464–471 (2012)
Rosiello, S., Choudhary, A., Roy, A., Ganesan, R.: Combining black box testing with white box code analysis: a heterogeneous approach for testing enterprise SaaS applications. In: International Symposium on Software Reliability Engineering, pp. 359–364 (2014)
Nasiri, R., Hosseini, S.: A case study for a novel framework for cloud testing. In: International Conference on Electronics, Computer and Computation, pp. 1–5 (2014)
Cunha, M., Mendonca, N., Sampaio, A.: A declarative environment for automatic performance evaluation in IaaS clouds. In: International Conference on Cloud Computing (CLOUD), pp. 285–292 (2013)
Prathibhan, C.M., Malini, A., Venkatesh, N., Sundarakantham, K.: An automated testing framework for testing Android mobile applications in the cloud. In: International Conference on Advanced Communication Control and Computing Technologies, pp. 1216–1219 (2014)
Zhou, J., Zhou, B., Li, S.: Automated model-based performance testing for PaaS cloud services. In: International Computer Software and Applications Conference, pp. 644–649 (2014)
Cico, O., Dika, Z.: Performance and load testing of cloud vs. classic server platforms (case study: social network application). In: Mediterranean Conference on Embedded Computing, pp. 301–306 (2014)
Ghosh, R., Longo, F., Naik, V.K., Trivedi, K.S.: Modeling and performance analysis of large scale IaaS clouds. Futur. Gener. Comput. Syst. 29(5), 1216–1234 (2013)
Brebner, P.C.: Is your cloud elastic enough?: performance modelling the elasticity of infrastructure as a service (IaaS) cloud applications. In: ACM/SPEC International Conference on Performance Engineering, pp. 263–266 (2012)
Albonico, M., Mottu, J., Sunyé, G.: Monitoring-based testing of elastic cloud computing applications. In: International Conference on Performance Engineering, pp. 3–6 (2016)
Tsai, W.T., Huang, Y., Shao, Q.: Testing the scalability of SaaS applications. In: IEEE International Conference on Service-Oriented Computing and Applications, pp. 1–4 (2011)
Cotroneo, D., Paudice, A., Pecchia, A.: Automated root cause identification of security alerts: evaluation in a SaaS cloud. Futur. Gener. Comput. Syst. 56, 375–387 (2016)
Yu, Y., et al.: Cloud data integrity checking with an identity-based auditing mechanism from RSA. Futur. Gener. Comput. Syst. 62, 85–91 (2016)
Tao, D., Lin, Z., Lu, C.: Cloud platform based automated security testing system for mobile Internet. Tsinghua Sci. Technol. 20(6), 537–544 (2015)
Tung, Y.H., Lin, C.C., Shan, H.L.: Test as a service: a framework for web security TaaS service in cloud environment. In: IEEE International Symposium on Service Oriented System Engineering, pp. 212–217 (2014)
Wu, C.S., Lee, Y.T.: Automatic SaaS test cases generation based on SOA in the cloud service. In: IEEE International Conference on Cloud Computing Technology and Science, pp. 349–354 (2012)
Jayasinghe, D., et al.: Expertus: a generator approach to automate performance testing in IaaS clouds. In: IEEE International Conference on Cloud Computing (CLOUD), pp. 115–122 (2012)
Bucur, S., Kinder, J., Candea, G.: Making automated testing of cloud applications an integral component of PaaS. In: 4th Asia-Pacific Workshop on Systems (2013)
Houlihan, R., Du, X., Tan, C.C., Wu, J., Guizani, M.: Auditing cloud service level agreement on VM CPU speed. In: IEEE International Conference on Communications (ICC), pp. 799–803 (2014)
Silva, A.C.D., Correa, L.R., Dias, L.A.V., Cunha, A.M.D.: A case study using testing technique for software as a service (SaaS). In: International Conference on Information Technology - New Generations, pp. 761–762 (2015)
Villanes, I.K., Costa, E.A.B., Dias-Neto, A.C.: Automated mobile testing as a service. In: IEEE World Congress on Services, pp. 79–86 (2015)
Tao, C., Gao, J., Li, B.: Cloud-based infrastructure for mobile testing as a service. In: International Conference on Advanced Cloud and Big Data, pp. 133–140 (2015)
Nachiyappan, S., Justus, S.: Cloud testing tools and its challenges: a comparative study. Procedia Comput. Sci. 50, 482–489 (2015)
Ganis, G., Panitkin, S.: Evaluating Google compute engine with PROOF. In: International Conference on Computing in High Energy and Nuclear Physics (2013)
Hwang, G.H., Wu-Lee, C., Tung, Y.H., Chuang, C.J., Wu, S.F.: Implementing TaaS-based stress testing by MapReduce computing model. In: IEEE International Conference on Software Engineering and Service Science, pp. 137–140 (2014)
Gao, Q., Wang, W., Wu, G., Li, X., Wei, J., Zhong, H.: Migrating load testing to the cloud: a case study. In: IEEE International Symposium on Service Oriented System Engineering, pp. 429–434 (2013)
Malini, A., Venkatesh, N., Sundarakantham, K., Mercyshalinie, S.: Mobile application testing on smart devices using MTAAS framework in cloud. In: International Conference on Computer and Communications Technologies, pp. 1–5 (2014)
Kim, T., et al.: Monitoring and detecting abnormal behavior in mobile cloud infrastructure. In: IEEE Network Operations and Management Symposium, Maui, pp. 1303–1310 (2012)
Salah, K., Al-Saba, M., Akhdhor, M., Shaaban, O., Buhari, M.I.: Performance evaluation of popular Cloud IaaS providers. In: International Conference on Internet Technology and Secured Transactions, pp. 345–349 (2011)
Gao, J., Pattabhiraman, P., Bai, X., Tsai, W.T.: SaaS performance and scalability evaluation in clouds. In: International Symposium on Service Oriented System Engineering, pp. 61–71 (2011)
Cotroneo, D., Frattini, F., Pietrantuono, R., Russo, S.: State-based robustness testing of IaaS cloud platforms. In: 5th International Workshop on Cloud Data and Platforms (2015)
Mathew, R., Spraetz, R.: Test automation on a SaaS platform. In: International Conference on Software Testing Verification and Validation, pp. 317–325 (2009)
Llamas, R.M., et al.: Testing as a service with HammerCloud. In: International Conference on Computing in High Energy and Nuclear Physics (2013)
Kuo, J.Y., Liu, C.H., Yu, Y.T.: The study of cloud-based testing platform for Android. In: IEEE International Conference on Mobile Services, pp. 197–201 (2015)
Ramachandran, M., Chang, V.: Towards performance evaluation of cloud service providers for cloud data security. Int. J. Inf. Manag. 36(4), 618–625 (2016)
Ye, L., Zhang, H., Shi, J., Du, X.: Verifying cloud service level agreement. In: Global Communications Conference, pp. 777–782 (2012)
Gonçalves, G.D., et al.: Workload models and performance evaluation of cloud storage services. Comput. Netw. (2016). http://dx.doi.org/10.1016/j.comnet.2016.03.024
Yan, M., Sun, H., Wang, X., Liu, X.: WS-TaaS: a testing as a service platform for web service load testing. In: IEEE 18th International Conference on Parallel and Distributed Systems, pp. 456–463 (2012)
Hedayati, H., Tabrizi, N.: MRSL: autonomous neural network-based 3-D positioning system. In: International Conference on Computational Science and Computational Intelligence, pp. 170–174 (2015)
Yadranjiaghdam, B., Komal, H., Tabrizi, N.: A risk evaluation framework for service level agreements. In: IEEE International Conference on Computer and Information Technology, pp. 681–685 (2016)
Zhang, S., Liu, Z.: Research on the construction and robustness testing of SaaS cloud computing data center based on the MVC design pattern. In: 2017 International Conference on Inventive Systems and Control (ICISC), pp. 1–4. IEEE, January 2017
El Nagdy, A.M., El Azim, M.A., AbdelRaouf, A.: A new framework of software testing using cloud computing for banking applications. J. Softw. 12(8), 657–664 (2017)
Chen, J., Wang, C., Liu, F., Wang, Y.: Research and implementation of a software online testing platform model based on cloud computing. In: 2017 Fifth International Conference on Advanced Cloud and Big Data (CBD), pp. 87–93. IEEE, August 2017
Bhushan, K., Gupta, B.B.: Hypothesis test for low-rate DDoS attack detection in cloud computing environment. Procedia Comput. Sci. 132, 947–955 (2018)
Acknowledgement
This research is supported in part by grant #1560037 from the National Science Foundation.