Caching for Mobile Social Networks with Deep Learning: Twitter Analysis for 2016 U.S. Election


Abstract:

With the rise of portable devices, people commonly access social media such as Twitter and Facebook through wireless networks, so data transmission rates are critically important to end users. In this work, we study context-aware data caching in heterogeneous small cell networks to reduce service delay, and we examine how device-to-device (D2D) and device-to-infrastructure (D2I) communications improve the system's social welfare. In the data-caching model, we consider three types of cache entities: macro cell base stations, small cell base stations, and end user devices. We propose a long short-term memory (LSTM) deep learning model to analyze the data and extract its information content. By knowing how interesting the data is to the cache entities, we can cache the data that is most likely to be requested by end users, reducing service latency. In simulations, we show that our proposed algorithm efficiently reduces service latency during the 2016 U.S. presidential election, when mobile users urgently requested election information through wireless networks. Compared with other mechanisms, such as a one-to-many matching algorithm or operating without D2D communication, our proposed algorithm significantly improves device performance and system social welfare.
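To make the abstract's pipeline concrete, the following is a minimal sketch (not the authors' implementation) of how an LSTM could score a tweet's token sequence for expected request interest, with the highest-scoring items then selected for caching at base stations or user devices. The vocabulary size, embedding and hidden dimensions, scoring head, and toy data below are all assumptions for illustration.

```python
# Hypothetical sketch: LSTM-based interest scoring to drive cache placement.
import torch
import torch.nn as nn

class InterestLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer-encoded tweet tokens
        emb = self.embed(token_ids)
        _, (h_n, _) = self.lstm(emb)               # final hidden state summarizes the tweet
        return torch.sigmoid(self.head(h_n[-1]))   # request-likelihood score in [0, 1]

if __name__ == "__main__":
    model = InterestLSTM()
    tweets = torch.randint(0, 10000, (4, 20))      # 4 dummy tweets, 20 tokens each (assumed data)
    scores = model(tweets).squeeze(-1)
    # Cache the top-k items with the highest predicted interest; k stands in for cache capacity.
    top_k = torch.topk(scores, k=2).indices
    print("cache candidates:", top_k.tolist())
```

In a deployment-style setting, the predicted scores would feed whatever placement rule the cache entities use (e.g., filling macro, small cell, and device caches in order of predicted interest); the ranking step shown here is only one plausible way to connect the LSTM output to the caching decision.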
Published in: IEEE Transactions on Network Science and Engineering ( Volume: 7, Issue: 1, 01 Jan.-March 2020)
Page(s): 193 - 204
Date of Publication: 01 May 2018
