Abstract:
The Shuffle module is one of the core modules of the Spark platform, and its performance directly influences the performance and throughput of the whole platform. The existing memory scheduling algorithm for the Shuffle process allocates memory to tasks evenly, based only on the number of tasks and without considering their different memory requirements, which lowers memory utilization and running efficiency when data is skewed. To solve this problem, a self-adaptive memory scheduling algorithm for the Shuffle process (SAMSAS) is proposed in this paper. SAMSAS does not require task-processing priorities to be set in advance; instead, it adjusts memory allocation adaptively by continuously monitoring and learning the actual memory requirements of executing tasks. The experimental results show that SAMSAS improves the utilization of the entire memory pool and the running efficiency of each task, and in particular it effectively improves the running efficiency of the Spark platform when processing skewed data.
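As a rough illustration of the idea described in the abstract (not the authors' implementation or Spark's actual memory manager), the Scala sketch below contrasts a baseline equal per-task split of a shuffle memory pool with an adaptive split proportional to each task's observed memory demand. The class and method names (AdaptiveShufflePool, recordUsage, grantFor) and the smoothing scheme are hypothetical and chosen only to make the contrast concrete.

// Hypothetical sketch: adaptive per-task shuffle memory allocation.
// Not the SAMSAS algorithm from the paper, only an illustration of the idea.
object AdaptiveShufflePoolDemo {

  class AdaptiveShufflePool(val poolBytes: Long) {
    // Exponentially smoothed estimate of each task's observed memory demand.
    private val observed = scala.collection.mutable.Map.empty[Long, Double]
    private val alpha = 0.5 // smoothing factor for the running estimate

    // Record how much memory a task actually used in its last allocation cycle.
    def recordUsage(taskId: Long, usedBytes: Long): Unit = {
      val prev = observed.getOrElse(taskId, usedBytes.toDouble)
      observed(taskId) = alpha * usedBytes + (1 - alpha) * prev
    }

    // Baseline: an equal split of the pool across all tasks.
    def equalShare(numTasks: Int): Long = poolBytes / numTasks

    // Adaptive: grant each task a share proportional to its observed demand.
    def grantFor(taskId: Long): Long = {
      val total = observed.values.sum
      if (total == 0) poolBytes / math.max(observed.size, 1)
      else (poolBytes * observed(taskId) / total).toLong
    }
  }

  def main(args: Array[String]): Unit = {
    val pool = new AdaptiveShufflePool(1L << 30) // 1 GiB shuffle memory pool
    // Skewed workload: task 1 processes far more data than tasks 2 and 3.
    pool.recordUsage(1, 600L << 20)
    pool.recordUsage(2, 100L << 20)
    pool.recordUsage(3, 100L << 20)
    println(s"equal share per task: ${pool.equalShare(3) >> 20} MiB")
    Seq(1L, 2L, 3L).foreach { t =>
      println(s"adaptive grant for task $t: ${pool.grantFor(t) >> 20} MiB")
    }
  }
}

Under the skewed usage above, the equal split would give every task roughly 341 MiB, whereas the proportional grant gives the heavy task about 768 MiB and the light tasks about 128 MiB each, which is the kind of imbalance-aware allocation the abstract attributes to SAMSAS.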
Date of Conference: 10-13 December 2018
Date Added to IEEE Xplore: 24 January 2019