Abstract:
The power consumption of IT equipment has always been a major challenge for data centers. Research communities are attempting to solve this problem by employing various energy-aware task scheduling or VM consolidation policies. It is crucial for these policies to identify the major parameters that affect power consumption and their correlations. In this paper, we first identify the major parameters using workload and power consumption data from real cloud services. An important observation is that the parameters are strongly interdependent, and the importance of individual parameters varies across cloud services. Nevertheless, existing power consumption models are unable to fully capture this feature and thus lack generalization. To address this gap, we propose Lapem, which adopts a Long Short-Term Memory (LSTM) network for power consumption estimation. Lapem further uses an attention mechanism to achieve stable performance and improve generalization. The experimental results demonstrate that Lapem estimates power consumption with a relative error as low as 2%-5%. More importantly, in comparison with state-of-the-art models, Lapem reduces the estimation error by more than 23% when generalizing to new cloud services.
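The attention mechanism mentioned in the abstract can be sketched roughly as follows. This is a minimal, illustrative example only; the pooling scheme, hidden dimensions, and the fixed scoring vector `w` are assumptions for demonstration, not Lapem's actual architecture. The idea shown: each timestep's LSTM hidden state (e.g. one per interval of workload metrics) is scored, the scores are normalized into attention weights, and the weighted sum forms a context vector fed to the final power estimate.

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_pool(hidden_states, w):
    """Dot-product attention pooling over a sequence of hidden states.

    hidden_states: list of T hidden vectors (lists of floats), as might be
                   produced by an LSTM over T timesteps of workload metrics.
    w:             scoring vector (learned in a real model; fixed here).
    Returns the attention-weighted context vector.
    """
    # Score each timestep's hidden state with a dot product against w.
    scores = [sum(wi * hi for wi, hi in zip(w, h)) for h in hidden_states]
    # Normalize scores into attention weights that sum to 1.
    alphas = softmax(scores)
    # Context vector = attention-weighted sum of the hidden states.
    dim = len(hidden_states[0])
    return [sum(a * h[i] for a, h in zip(alphas, hidden_states))
            for i in range(dim)]

# Toy example: 3 timesteps, 2-dimensional hidden states.
H = [[0.1, 0.2], [0.5, 0.4], [0.9, 0.1]]
context = attention_pool(H, w=[1.0, 0.5])
```

In a full estimator, `context` would typically pass through a small fully connected layer to produce the scalar power prediction; letting the weights vary per input is what allows different parameters to dominate for different cloud services.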
Date of Conference: 20-24 May 2019
Date Added to IEEE Xplore: 15 July 2019