A New Design Framework on D2D Coded Caching with Optimal Rate and Less Subpacketizations

Abstract:

In this paper, we propose a new design framework for Device-to-Device (D2D) coded caching networks that achieves the optimal communication load (rate) with a significantly lower file subpacketization level than that of the well-known D2D coded caching scheme proposed by Ji, Caire and Molisch (JCM). The proposed design framework is referred to as the Packet Type-based (PTB) design, where each file is partitioned into packets according to pre-defined packet types, and both the cache placement and the user multicast grouping are based on these packet types. This leads to the so-called raw packet saving gain in the subpacketization level. By a careful selection of transmitters within each multicast group, a so-called further splitting ratio gain in the subpacketization can also be achieved. Through the joint effect of the raw packet saving gain and the further splitting ratio gain, an order-wise subpacketization reduction can be achieved compared to the JCM scheme while preserving the optimal rate. In addition, to the best of our knowledge we show for the first time in the literature that unequal subpacketization is a key to achieving subpacketization reductions when the number of users is odd. As a by-product, our results indicate that, at least for the sake of subpacketization, a new design framework is indeed needed instead of directly translating shared-link caching schemes into D2D caching schemes.
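
For context, the JCM baseline that the abstract compares against uses a Maddah-Ali–Niesen-style placement with parameter t = KM/N; its subpacketization grows as t·C(K, t) while its rate is (N/M)(1 − M/N). The following minimal sketch, assuming these standard formulas from the JCM literature (they are not taken from this paper and do not describe the PTB scheme), illustrates how quickly the baseline subpacketization grows relative to the rate.

```python
from math import comb

def jcm_baseline(K: int, N: int, M: int):
    """Baseline quantities for the JCM D2D coded caching scheme.

    Assumes the standard formulas from the JCM literature:
      t = K*M/N (assumed to be an integer),
      subpacketization F = t * C(K, t),
      rate R = (N/M) * (1 - M/N).
    These are background quantities only, not the PTB scheme of this paper.
    """
    assert (K * M) % N == 0, "t = K*M/N must be an integer for this sketch"
    t = K * M // N
    subpacketization = t * comb(K, t)
    rate = (N / M) * (1 - M / N)
    return t, subpacketization, rate

if __name__ == "__main__":
    # Example: K = 10 users, N = 10 files, cache size M = 5 files per user.
    t, F, R = jcm_baseline(K=10, N=10, M=5)
    print(f"t = {t}, JCM subpacketization = {F}, rate = {R}")
    # -> t = 5, JCM subpacketization = 5 * C(10, 5) = 1260, rate = 1.0
```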
Date of Conference: 21-26 June 2020
Date Added to IEEE Xplore: 24 August 2020
Publisher: IEEE
Conference Location: Los Angeles, CA, USA
