Computer Networks

Volume 173, 22 May 2020, 107182

Tracking area list allocation scheme based on overlapping community algorithm

https://doi.org/10.1016/j.comnet.2020.107182

Abstract

Reducing the signalling overhead for tracking and paging mobile devices is a challenging issue in the study of location management in cellular networks. Cellular networks have become massive generators of data, and in the forthcoming years this data is expected to increase drastically. Big data-based intelligence and analytics can improve network operational efficiency and user service quality. This work proposes to exploit massive handover and paging data from cellular networks to minimize the signalling caused by user mobility. In this paper, we offer a new holistic tracking area list (TAL) management methodology that considers group user mobility behavior and paging characteristics. First, a series of graphs showing the evolution of user mobility and traffic is built from handover and paging statistics in the network management system (NMS). Then, the TAL allocation problem is formulated as a classical graph partitioning problem, which is solved by an overlapping community detection algorithm based on game theory. Results show that the proposed method can effectively reduce the location management signalling overhead and improve the TAL configuration efficiency.

Introduction

In the future 5G cellular network, a large number of mobile terminals will emerge, and the requirements for network capacity will increase significantly. There will be large numbers of mobile devices in densely populated places such as indoor environments, shopping malls, and stadiums. At the same time, mobile terminals have heterogeneous service requirements, which leads to a diversification of traffic demands. In the future, hotspot small cells with self-organizing, low-cost, low-power characteristics will be densely deployed [1]. With shrinking cell sizes and higher user density and mobility, location update and paging frequencies will increase significantly, resulting in a huge location management signalling overhead [2]. Therefore, the location management strategy is required to minimize the signalling overhead caused by user movement and paging in order to improve network performance.

Mobility management is one of the core procedures of cellular networks. It allows the network to track user equipment (UE) and deliver communication services in a seamless manner. In the long-term evolution (LTE) network, tracking users while reducing signalling overhead and optimizing network performance is necessary. Hence, the location management strategy divides the entire network area into tracking areas (TAs). A TA consists of several cells, and each cell belongs to only one TA. To identify a user's location, the LTE core network pages the tracking area in which the user was last registered. The paging signal is received by all of the cells that reside in that tracking area. Additionally, the user updates the core network by sending a tracking area update signal once it moves from one tracking area to another [3], [4]. The traditional TA concept has an advantage in minimizing paging overhead, but it still has the following shortcomings:

  • The ‘ping-pong’ effect may result in more location update signalling overhead. Users repeatedly crossing the border between two neighboring cells that belong to different TAs generate frequent location updates, which may be exacerbated in a densely deployed cell environment.

  • Mobility signalling congestion is caused by a large number of users with similar behavior, for example, a large number of users moving from one TA to another.

  • The use of TAs has a symmetry limitation: if two cells are in the same TA, they cannot be in any other TA.

LTE tackled this problem by introducing the concept of the tracking area list (TAL) [5]. The UE sends a tracking area update (TAU) message to the core network (CN) every time it moves to a new location and connects to a new TA. On the other hand, when a connection request arrives for a UE, the network sends a paging message to all TAs (i.e., the TAL) where the UE is registered. An increase in TAL size leads to a rise in paging signalling messages and a decrease in TAU signalling messages. Fig. 1 shows the trade-off between TAU and paging overheads when forming TALs. In the figure, we assume that the network contains three TAs along a railway path, in which each TA has neighboring TAs on its left and right sides. From Fig. 1(a), we observe that organizing each TA in a separate TAL causes many TAU signalling messages in the network, which are generated and forwarded from the tracking area to the CN, whereas Fig. 1(b) and (c) show that increasing the TAL size reduces TAU overhead and increases paging overhead. Fig. 1(c) shows that the TAU overhead can be ignored if all TAs are organized in the same TAL. Since different UEs can be configured with different lists, the signalling associated with updates is distributed among several cells, and area borders become blurred. Also, a newly allocated list may overlap with the previous one. This solves the problem of frequent updates due to UEs moving on the border between two TAs or due to sharp changes in channel conditions.
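To make the trade-off concrete, the following minimal Python sketch (not from the paper) estimates the signalling cost of the three configurations of Fig. 1 for the railway example; the crossing rate, paging rate, and per-message unit costs are illustrative assumptions.

```python
# A minimal sketch of the TAU/paging trade-off in Fig. 1: three TAs on a
# railway path, grouped into TALs of 1, 2, or 3 consecutive TAs.
# All rates and unit costs below are illustrative assumptions.

def signalling_cost(tal_size, crossings=1000, pagings=1000,
                    c_tau=10.0, c_page=1.0):
    """Rough signalling cost for TALs covering `tal_size` consecutive TAs.

    - A TAU is sent only when the UE leaves its current TAL, so larger
      lists see fewer of the `crossings` TA-boundary crossings.
    - A paging request is broadcast to every TA in the UE's TAL, so
      larger lists multiply the paging load.
    """
    tau_events = crossings / tal_size       # TAL-boundary crossings
    page_msgs = pagings * tal_size          # paging fan-out over the list
    return c_tau * tau_events + c_page * page_msgs

for size in (1, 2, 3):                      # Fig. 1(a), (b), (c)
    print(f"TAL size {size}: cost = {signalling_cost(size):.0f}")
```

Under these assumed unit costs the sketch reproduces the qualitative behavior of Fig. 1: TAU cost falls and paging cost grows as the list covers more TAs, so the total is minimized at an intermediate list size.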

Several research works have been conducted to solve the TAL problem. The aim is to capture the tradeoff that mitigates the overhead of TAU and paging messages when constructing and assigning TALs to UEs. Research on location update strategies started with the global system for mobile communications (GSM) and the optimization of its location areas [6], [7]. There has been extensive work on the matter, and two broad classes of strategies can be identified: those based on user status information and those independent of it.

A tracking area configuration method based on the minimum signalling cost between adjacent regions is proposed in [8]. The main feature of this method is that, even when different mobile users are in the same TA, the TAL allocated to them varies with the last registered TA. The method proposed in [9] is a relatively simple and effective method at present. The main idea of [9] is consistent with that of [8], but the method is more straightforward and faster to implement. A technique to manage TALs is proposed in [10] by recording the time mobile users enter each tracking area. This method uses the entry time as reference information when assigning a TAL to a mobile user entering the TA, so the TAL can be tailored to reduce the frequency of location updates. However, the shortcomings are also evident: the method is particularly unsuitable for users with high mobility, because the TAL is updated every time a new TA is entered. A TAL allocation scheme based on local mobile users is proposed in [11]. This method can reduce the signalling overhead of location management for mobile users who stay in a specific area. However, with increasingly smaller cell sizes and higher user density and mobility, the location update frequency of mobile users increases significantly, and the method becomes relatively weak.

A method was proposed in [12] that measures mobile users' movement and paging characteristics between two consecutive sessions and obtains the optimal movement threshold; the corresponding TAL is then allocated according to the optimal threshold value of each mobile user. A method was proposed in [13] that analyzes the recorded tracking area information contained in the update request message sent by the mobile user to determine the direction and speed of user movement; a TAL is then designed according to the direction and rate of the user's mobility. A method of dynamically adjusting the TAL size by seamlessly supervising users' motion state was proposed in [14]. A method was proposed in [21] that uses a bargaining game to ensure a fair tradeoff between TAU and paging signalling overhead (a TA-to-TAL allocation scheme); Bagaa et al. [21] use a cell-to-TAL dynamic allocation scheme.

Game theory has been widely used as a convenient tool for the design of future wireless and communication networks in recent years [23]. The work in [16] utilizes game theory to detect overlapping communities in social networks. Staudt and Meyerhenke [24] proposed a parallel label propagation algorithm based on the idea of memory sharing, which achieved an excellent recognition effect on large-scale network data. Zhang et al. [25] designed a modularity-based solution method and improved the update strategy of the Fast-Newman algorithm, which improved the efficiency of community identification. The applicability of [16] to wireless networks has been verified, and its detection effect and efficiency on large-scale data sets are relatively good.

Our work is carried out in the context of future 5G networks, in which small cell densification is a crucial factor in increasing network capacity. In this context, handling mobility is a very challenging issue. Idle-mode mobility requires imaginative ideas and improved management so that signalling is kept minimal and far from congesting the network. Also, data mining algorithms are expected to play a critical role in 5G network self-optimization thanks to the availability of big data provided by users and network elements.

The TAL allocation schemes proposed in the above literature are weak in optimizing TAU and paging signalling overhead: under greater cell density and user density, they have lower allocation efficiency, which leads to network channel congestion. In this paper, we collect handover and paging data from the NMS and model the TAL assignment problem as a graph partitioning problem, which significantly simplifies the process of TAL allocation. We also propose a TAL allocation scheme based on an overlapping community detection algorithm to partition the TAL model graph. In our work, we fully consider user handover behavior and paging features to efficiently provide TAL services for users in the tracking area. Finally, simulation results show that the proposed TAL allocation scheme can effectively reduce the total signalling overhead and improve the TAL allocation efficiency.
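As a rough illustration of this formulation, the sketch below builds a weighted TA graph from NMS-style handover and paging counters and scores a candidate partition into TALs. The data layout (`handover_counts`, `paging_counts`) and the unit costs are hypothetical placeholders, not the authors' schema.

```python
# A minimal sketch, under assumed data formats, of casting TAL allocation
# as graph partitioning: TAs become nodes, edge weights carry handover
# (TAU-generating) traffic, and node weights carry paging load.
import networkx as nx

handover_counts = {(1, 2): 480, (2, 3): 350, (1, 3): 40}   # HO between TAs
paging_counts = {1: 900, 2: 700, 3: 650}                    # pagings per TA

G = nx.Graph()
for ta, pages in paging_counts.items():
    G.add_node(ta, paging=pages)           # node weight: paging load
for (a, b), ho in handover_counts.items():
    G.add_edge(a, b, weight=ho)            # edge weight: handover traffic

def tal_cost(partition, c_tau=10.0, c_page=1.0):
    """Score a TA -> TAL assignment (dict mapping TA to TAL label)."""
    # TAU cost: handover traffic crossing TAL borders (the cut).
    cut = sum(d["weight"] for u, v, d in G.edges(data=True)
              if partition[u] != partition[v])
    # Paging cost: each paging is broadcast to every TA in its TAL.
    sizes = {}
    for tal in partition.values():
        sizes[tal] = sizes.get(tal, 0) + 1
    page = sum(G.nodes[ta]["paging"] * sizes[tal]
               for ta, tal in partition.items())
    return c_tau * cut + c_page * page

print(tal_cost({1: "A", 2: "A", 3: "B"}))   # TAL {TA1,TA2} vs {TA3}
print(tal_cost({1: "A", 2: "A", 3: "A"}))   # all TAs in one TAL
```

Grouping TAs that exchange heavy handover traffic removes those edges' TAU cost from the cut, while the paging term penalizes oversized lists, which is exactly the trade-off the partitioning objective must balance.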

The contributions of this study can be summarized as follows:

  1. The problem of TAL allocation is modelled as a classical graph partitioning problem, which greatly simplifies the process of TAL allocation. Also, a tradeoff between TAU and paging overhead is taken into account.

  2. The partitioning problem is solved using a community detection algorithm from data mining, to better realize the tradeoff between TAU and paging overhead.

  3. The proposed game theory-based overlapping community detection algorithm solves the TAL allocation problem. Compared with the existing TAL allocation schemes, our scheme effectively reduces the signalling overhead and improves TAL allocation efficiency.

The rest of the paper is organized as follows. After the introduction and related work in Section 1, we illustrate the system model in Section 2. Section 3 presents our proposed algorithm. The experimental simulation of our proposed method is described in Section 4. Finally, the paper is concluded in Section 5.

Section snippets

Cellular deployment model

With the development of small cell technology, the capacity of existing cellular networks has been dramatically increased. However, the number of cellular BSs is huge and their deployment is random, so it cannot be simulated with the traditional wrap-around model. Some scholars have proposed to simulate the deployment of cellular BSs using the homogeneous Poisson point process (HPPP) from stochastic geometry theory [15]. The HPPP model has been widely accepted and applied in…
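A minimal sketch of such an HPPP deployment, assuming an illustrative intensity value, could look as follows: the number of BSs in a region is Poisson-distributed with mean equal to intensity times area, and the positions are then independently uniform over the region.

```python
# A minimal sketch of the HPPP deployment model cited in [15]: base
# station positions drawn from a homogeneous Poisson point process of
# intensity `lam` (BSs per unit area) on a square region. The intensity
# and region size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
lam, side = 50, 10.0                      # intensity, region side length

# Number of BSs is Poisson with mean = intensity * area ...
n_bs = rng.poisson(lam * side * side)
# ... and, conditioned on the count, positions are i.i.d. uniform.
xy = rng.uniform(0.0, side, size=(n_bs, 2))
print(f"{n_bs} base stations dropped on a {side}x{side} region")
```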

Tracking area lists allocation scheme

Complex networks are self-organizing, self-similar, and scale-free. In hotspot environments, cell deployments are random, which is similar to the nature of complex networks. The community structure is an essential feature of complex networks; as shown in Fig. 3, the TAL allocation method is regarded as an overlapping community detection method in a complex network. Different TALs (i.e., different communities) can contain the same TA (i.e., overlapping nodes in the community). No one in the…
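As a hedged illustration of the game-theoretic idea (a simplified stand-in in the spirit of [16], not the authors' exact algorithm), the sketch below treats each node as a selfish player that joins or leaves communities by best response until no unilateral change improves its utility; the utility function, membership cost, and test graph are illustrative choices.

```python
# A minimal sketch of game-theoretic overlapping community detection:
# nodes (TAs) repeatedly play best responses, joining/leaving candidate
# communities (TALs) to maximize a local utility. Nodes that end up in
# several communities play the role of TAs shared by multiple TALs.
import networkx as nx

G = nx.karate_club_graph()                 # stand-in for the TA graph
membership = {n: {n} for n in G}           # start: one community per node

def utility(node, comms):
    # Gain: neighbors shared per joined community, normalized by degree;
    # loss: a fixed cost per membership (0.2 is an illustrative value).
    gain = sum(sum(1 for nb in G[node] if c in membership[nb])
               for c in comms)
    return gain / max(G.degree[node], 1) - 0.2 * len(comms)

for _ in range(20):                        # best-response rounds (capped)
    changed = False
    for node in G:
        nbr_comms = set().union(*(membership[nb] for nb in G[node]))
        best = set(membership[node])
        for c in nbr_comms:                # try joining or leaving c
            for cand in (best | {c}, best - {c}):
                if cand and utility(node, cand) > utility(node, best):
                    best, changed = cand, True
        membership[node] = best
    if not changed:                        # local Nash equilibrium
        break

communities = {}
for node, cs in membership.items():
    for c in cs:
        communities.setdefault(c, set()).add(node)
overlap = [n for n, cs in membership.items() if len(cs) > 1]
print(f"{len(communities)} communities, {len(overlap)} overlapping nodes")
```

The key property this sketch demonstrates is that, unlike disjoint partitioning, the equilibrium membership sets may overlap, which is what allows one TA to appear in several TALs.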

Experimental simulation

In this section, to evaluate the performance of the proposed method, experimental parameters and performance metrics are first given, and the results are presented later.

Conclusion

An important challenge in cellular networks is to cope with the amount of signalling generated by location management, especially as the dense deployment of small cell base stations in hotspots increases the signalling overhead of the network. This paper proposes a TAL management method based on overlapping community detection to reduce the signalling overhead of location management. On the basis of the TA planning results, the TAL partitioning model will be classified into new…

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was partially supported by the National Key R&D Program of China (No.2018YFB0803600), National Natural Science Foundation of China (No.61801008), Beijing Municipal Natural Science Foundation (No.L172049), Scientific Research Common Program of Beijing Municipal Commission of Education (No.KM201910005025).


References (25)

  • Ikeda et al., A tracking area list configuration method to mitigate burst location updates, IEEE Fifth International Conference on Communications and Electronics, IEEE (2014).

  • Wang et al., Modeling of tracking area list-based location update scheme in long term evolution, IEEE International Conference on Communications, IEEE (2014).

Shanshan Tu (M'17) received the Ph.D. degree from Beijing University of Posts and Telecommunications, Beijing, China, in 2014. From 2013 to 2014, he visited the University of Essex, Colchester, U.K., for National Joint Doctoral Training. He worked with the Department of Electronic Engineering at Tsinghua University, Beijing, China, as a Postdoctoral Researcher from 2014 to 2016. He is currently an Assistant Professor with the Faculty of Information Technology, Beijing University of Technology, Beijing, China. His research interests include cloud computing, MEC, and information security techniques.

Muhammad Waqas (M'18) received his B.Sc. and M.Sc. degrees from the Department of Electrical Engineering, University of Engineering and Technology Peshawar, Pakistan, in 2009 and 2014, respectively. From 2012 to 2015, he also served Sarhad University of Science and Information Technology, Peshawar, Pakistan, as a Lecturer and program coordinator. Dr. Muhammad Waqas pursued his Ph.D. degree (Sept. 2015 to Jun. 2019) with the Beijing National Research Center for Information Science and Technology, Department of Electronic Engineering, Tsinghua University, Beijing, China. Currently, he is a research associate in the Beijing Key Laboratory of Trusted Computing, Faculty of Information Technology, Beijing, China. He is also associated with the Faculty of Computer Science and Engineering, GIK Institute of Engineering Sciences and Technology, Pakistan. He has several research publications in reputed journals and conferences. His current research interests are in the areas of networking and communications, including 5G networks, D2D communication resource allocation, physical layer security and information security, mobility investigation in D2D communication, fog computing, and MEC.

Qiangqiang Lin received the B.Sc. degree from North University of China, Taiyuan, China, in 2017. He is currently working toward the M.Sc. degree at Beijing University of Technology, Beijing, China. His research interests include pattern recognition and machine learning.

Sadaqat ur Rehman received the B.Sc. Hons. degree from the University of Engineering and Technology Peshawar, Peshawar, Pakistan, and the M.Sc. degree from the Sarhad University of Science and Information Technology, Peshawar, Pakistan, in 2011 and 2014, respectively. He is currently working toward the Ph.D. degree in the NTN LAB at the Department of Electronic Engineering, Tsinghua University, Beijing, China. His research interests include deep learning, including convolutional neural networks, unsupervised learning algorithms, and optimization techniques.

Dr. Hanif obtained his B.Sc. degree (March 2002 to Jan 2006) in Computer Engineering from the Electrical and Computer Engineering Department, COMSATS Institute of Information Technology, Abbottabad, Pakistan. Soon after his graduation, he joined COMSATS Institute of Information Technology, Islamabad, as a Lecturer. He completed his M.Sc. degree (Aug 2007 to Aug 2009) in Information Technology with specialization in Signal Processing from the Department of Signal Processing, Faculty of Engineering Sciences, Tampere University of Technology, Tampere, Finland. His master's thesis, part of a project jointly funded by the European Union (EU), focused on digital image processing applications in the paper making industry. In 2010, Dr. Hanif received a TUBITAK fellowship from the Scientific and Technological Research Council of Turkey to work as a visiting researcher with Dr. Gozde Unal on brain image registration at Engineering and Natural Sciences, Sabanci University, Istanbul, Turkey. Dr. Hanif pursued his Ph.D. degree (Jun. 2011 to Jun. 2015) in Image Processing from the College of Engineering and Computer Science, Australian National University, Canberra, Australia. He was associated with the Computer Vision and Robotics research group, National ICT Australia (NICTA). During his Ph.D. studies, his research was mainly focused on blind image deconvolution and sparse image processing. Dr. Hanif received the European Research Consortium for Informatics and Mathematics (ERCIM) fellowship (Jan 2017 to Nov 2019) for his postdoc at the Italian National Research Council (CNR), Pisa, Italy, where he was associated with the Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo", CNR. During his postdoc, he mainly worked on sparse signal representation-based approaches to ancient manuscript restoration, retrieval, and classification. Currently he is an assistant professor at the Faculty of Computer Science and Engineering, Ghulam Ishaq Khan Institute of Engineering Sciences and Technology, Topi, Pakistan.

Chuangbai Xiao received the Ph.D. degree from Tsinghua University, Beijing, China, in 1995. Since 2001, he has been teaching and researching with the Faculty of Information Technology, Beijing University of Technology, Beijing, China, where he is currently a Professor. He has authored or coauthored over 100 papers in peer-reviewed journals, conferences, and workshops.

M. Majid Butt received the M.Sc. degree in digital communications from Christian Albrechts University, Kiel, Germany, in 2005, and the Ph.D. degree in telecommunications from the Norwegian University of Science and Technology, Trondheim, Norway, in 2011. He is a Senior Scientist, 5G+ Research, at Nokia Bell Labs, Paris-Saclay, France, and a Visiting Research Assistant Professor at Trinity College Dublin, Dublin, Ireland. Prior to that, he held various positions at the University of Glasgow, Glasgow, U.K., Trinity College Dublin, Fraunhofer HHI, Berlin, Germany, and the University of Luxembourg, Luxembourg City, Luxembourg. His current research interests include communication techniques for wireless networks with a focus on radio resource allocation, scheduling algorithms, energy efficiency, and machine learning for RAN. He has authored more than 60 peer-reviewed conference and journal publications in these areas. Dr. Butt was a recipient of the Marie Curie Alain Bensoussan Post-Doctoral Fellowship from the European Research Consortium for Informatics and Mathematics (ERCIM). He has served as the Organizer/Chair for various technical workshops on aspects of communication systems in conjunction with major IEEE conferences, including the Wireless Communications and Networking Conference, Globecom, and Greencom. He has been an Associate Editor for IEEE Access and the IEEE Communications Magazine since 2016.

Chin-Chen Chang (F'98) received the B.Sc. degree in applied mathematics and the M.Sc. degree in computer and decision sciences from National Tsing Hua University, Hsinchu, Taiwan, and the Ph.D. degree in computer engineering from National Chiao Tung University, Hsinchu, Taiwan. He was with National Chung Cheng University, Chiayi County, Taiwan, from 1989 to 2005. He has been an Associate Professor with National Chiao Tung University, a Professor with National Chung Hsing University, Taichung, Taiwan, and a Chair Professor with National Chung Cheng University. He has also been a Visiting Researcher and a Visiting Scientist with Tokyo University, Tokyo, Japan, and Kyoto University, Kyoto, Japan. He has been a Chair Professor with the Department of Information Engineering and Computer Science, Feng Chia University, Taichung, Taiwan, since 2005. He has also served as the Chairman of the Institute of Computer Science and Information Engineering, the Dean of the College of Engineering, a Provost and the Acting President of National Chung Cheng University, and the Director of the Advisory Office of the Ministry of Education, Taipei, Taiwan. He has been invited to serve as a Visiting Professor, a Chair Professor, an Honorary Professor, an Honorary Director, an Honorary Chairman, a Distinguished Alumnus, a Distinguished Researcher, and a Research Fellow with universities and research institutes. His research interests include database design, computer cryptography, image compression, and data structures. Dr. Chang is currently a Fellow of the IEE, U.K. He has received several research awards and honorary positions in prestigious organizations, both nationally and internationally.
