
Advancing Software-Defined Service-Centric Networking Toward In-Network Intelligence



Abstract:

Recent advances in edge intelligence, service-centric networking, and software-defined networking have each profoundly impacted future networks and intelligent applications. Nevertheless, they have not been jointly considered, and combining these three techniques opens up a nascent research direction. In this article, we propose a software-defined service-centric networking (SDSCN) framework to push the frontier of edge intelligence toward general in-network intelligence. We devise a three-plane architecture comprising a data plane, a management plane, and a control plane. Specifically, in-network context-aware computing and caching capabilities are incorporated into the service-centric data plane. Network programmability and global orchestration make it possible to design a variety of network applications that customize network behaviors flexibly. Case studies illustrate how in-network federated learning works in SDSCN, and its feasibility is validated by emulation. Further simulation results demonstrate the performance advantages of SDSCN. Finally, several challenges in developing the proposed framework are presented and discussed.
Published in: IEEE Network ( Volume: 35, Issue: 5, September/October 2021)
Page(s): 210 - 218
Date of Publication: 15 September 2021


Introduction

Recent advances in computing, caching, and communication have fueled a plethora of innovations, which have had a profound impact on the development of artificial intelligence (AI) applications such as facial recognition, speech understanding, gaming, and industrial automation. Given the long distance between the cloud and edge data, edge computing has become a competitive computational paradigm for delay-sensitive AI applications. Pushing the resource-intensive AI frontier to the edge ecosystem, however, is highly challenging due to concerns about performance and cost. Recent breakthroughs in lightweight AI methods have given rise to a new research area, namely edge intelligence (EI) [1], which makes the most of widespread edge resources to gain AI insight. Many enabling technologies have been widely utilized for model training (e.g., federated learning and DNN splitting) and inference (e.g., model partition and model compression) at edge devices (e.g., base stations, vehicles, and drones) [1].
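To make the federated learning idea mentioned above concrete, the following is a minimal sketch of federated averaging (FedAvg): each edge device trains on its private data shard, and a coordinator averages the resulting models weighted by local data size. The toy linear-regression task and all function names here are illustrative assumptions, not the article's implementation.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient descent on squared loss."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server step: average client models, weighted by local data size."""
    total = sum(client_sizes)
    return sum((n / total) * w for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])   # ground-truth model the clients jointly learn
w_global = np.zeros(2)

# Each edge device holds its own private data shard; raw data never leaves it.
shards = []
for _ in range(3):
    X = rng.normal(size=(40, 2))
    y = X @ w_true + 0.01 * rng.normal(size=40)
    shards.append((X, y))

for _ in range(20):  # communication rounds
    updates = [local_update(w_global.copy(), X, y) for X, y in shards]
    w_global = fed_avg(updates, [len(y) for _, y in shards])

print(np.round(w_global, 2))
```

Only model parameters cross the network in each round, which is precisely why federated learning suits the in-network intelligence setting the article targets: the data plane can carry compact model updates instead of raw edge data.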

