Abstract:
Undoubtedly, cloud computing is an ongoing trend that has attracted widespread attention. It has succeeded in relieving users of their computational and storage burdens by offering them as services under a pay-as-you-go principle. A substantial feature of the cloud is its ability to scale up as users' demands rise. However, cloud providers face a dilemma: scaling up to fulfill user requirements leads to a dramatic increase in energy consumption. To establish a trade-off, server consolidation tackles this challenge by maximizing resource utilization per server so as to minimize the number of active servers. With the ever-growing demand for cloud services, providers must be able to manage highly fluctuating workloads while avoiding SLA violations. In this paper we are concerned with two main levels: the front-end and the back-end. The former comprises the users who consume cloud services, while the latter is materialized by the data centers where the actual load is carried out. It is significant that front-end users are chiefly responsible for the shape of the workload within data centers. We point out this fact in order to further investigate server workload from a real-time angle. The main goal of this paper is to formalize the server load according to users' behavior, in terms of submitted tasks and submission rate, and to apply stream mining techniques as an introductory step toward building a real-time prediction system.
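To make the idea of stream-based load prediction concrete, the following is a minimal sketch, not the paper's actual method: it aggregates a stream of user task submissions into fixed time windows and forecasts the next window's server load with an exponentially weighted moving average, a simple stand-in for a stream mining predictor. All names (TaskEvent, StreamLoadPredictor, alpha, window) are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class TaskEvent:
    user_id: int       # front-end user who submitted the task
    cpu_demand: float  # resource demand of the submitted task
    timestamp: float   # submission time in seconds

class StreamLoadPredictor:
    """Incrementally tracks per-window load and forecasts the next window."""
    def __init__(self, window: float = 60.0, alpha: float = 0.3):
        self.window = window        # aggregation window length (s)
        self.alpha = alpha          # EWMA smoothing factor
        self.window_start = None
        self.window_load = 0.0
        self.prediction = 0.0       # forecast for the next window

    def observe(self, event: TaskEvent) -> float:
        if self.window_start is None:
            self.window_start = event.timestamp
        # Close elapsed windows and fold their load into the forecast.
        while event.timestamp - self.window_start >= self.window:
            self.prediction = (self.alpha * self.window_load
                               + (1 - self.alpha) * self.prediction)
            self.window_load = 0.0
            self.window_start += self.window
        self.window_load += event.cpu_demand
        return self.prediction

# Usage: feed submission events as they arrive and read the running forecast.
predictor = StreamLoadPredictor()
stream = [TaskEvent(1, 2.0, 0.0), TaskEvent(2, 1.5, 30.0), TaskEvent(1, 3.0, 65.0)]
for ev in stream:
    print(predictor.observe(ev))
```

A real system would replace the EWMA with an incremental learner suited to concept drift, but the single-pass, constant-memory update pattern is the essence of the stream mining setting described above.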
Date of Conference: 11-13 January 2017
Date Added to IEEE Xplore: 17 April 2017