Abstract:
Garbage collection (GC), an essential operation in flash storage systems, causes long tail latency, which is one of the key problems in real-time and quality-critical systems. In this article, we take advantage of reinforcement learning (RL) to reduce long tail latency. In particular, we propose two novel techniques: 1) a Q-table cache (QTC) and 2) Q-value prediction. The QTC allows us to utilize appropriate and frequently recurring key states at a small memory cost. We propose a neural network, called the Q-value prediction network (QP Net), that predicts the initial Q-value of a new state in the QTC. The integrated solution of QTC and QP Net enables us to benefit from both the short-term (via QTC) and long-term (via QP Net) history of system behavior to reduce long tail latency. The experimental results demonstrate that the proposed scheme offers significant reductions (25%-37%) in the long tail latency of storage-intensive workloads compared with the state-of-the-art solution that adopts an RL-assisted GC scheduler.
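The sketch below illustrates, in a hedged and simplified form, how a Q-table cache might be backed by a Q-value prediction network: cached entries hold Q-values for frequently recurring states (short-term history), and on a cache miss a small predictor supplies the initial Q-values (long-term history). This is not the authors' implementation; the class names (`QPNet`, `QTableCache`), the LRU eviction policy, and the network shape are illustrative assumptions.

```python
# Minimal sketch, assuming an LRU-bounded cache and a tiny feedforward predictor.
from collections import OrderedDict
import numpy as np


class QPNet:
    """Stand-in for a Q-value prediction network: one hidden ReLU layer."""

    def __init__(self, feature_dim, n_actions, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(feature_dim, hidden))
        self.w2 = rng.normal(scale=0.1, size=(hidden, n_actions))

    def predict(self, state_features):
        h = np.maximum(0.0, state_features @ self.w1)  # hidden activations
        return h @ self.w2                             # initial Q-values per action


class QTableCache:
    """Bounded Q-table: frequently recurring states stay resident (LRU)."""

    def __init__(self, capacity, qp_net):
        self.capacity = capacity
        self.qp_net = qp_net
        self.table = OrderedDict()  # state key -> Q-value vector

    def q_values(self, key, features):
        if key not in self.table:
            # Cache miss: evict the least recently used entry if full,
            # then initialize the new entry from the predictor network.
            if len(self.table) >= self.capacity:
                self.table.popitem(last=False)
            self.table[key] = self.qp_net.predict(features)
        self.table.move_to_end(key)  # mark as recently used
        return self.table[key]

    def update(self, key, action, target, lr=0.1):
        # Tabular Q-learning style update on the cached entry;
        # assumes q_values() was already called for this key.
        q = self.table[key]
        q[action] += lr * (target - q[action])
```

In this sketch the cache captures recent, recurring system states cheaply, while the predictor provides a warm start for states that fall out of (or never enter) the cache; the real scheme additionally trains QP Net on long-term GC behavior, which is omitted here.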
Published in: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems (Volume: 39, Issue: 10, October 2020)