A context-aware cache structure for mobile computing environments

https://doi.org/10.1016/j.jss.2006.10.027

Abstract

This paper proposes a cache management method that maintains a mobile terminal’s cache content by prefetching data items with maximum benefit and evicting cache entries with minimum benefit. The benefit of a data item is evaluated based on the user’s query context, which is defined as a set of constraints (predicates) that define both the movement pattern and the information context requested by the mobile user. A context-aware cache is formed and maintained using a set of neighboring locations (called the prime list) that are restricted by the validity of the data fetched from the server. Simulation results show that the proposed strategy, at different levels of granularity, can greatly improve system performance in terms of the cache hit ratio.

Introduction

Data caching as a means to achieve higher performance has been a perpetual quest in the computer industry. Most recently, owing to the significantly increased memory available on small devices, caches are used as small database systems that hold the content most likely to be used in the near future (this is called prefetching or hoarding). Because cache size remains limited, however, cache management strategies are needed to manage the cache content efficiently. Location information is typically used as a key field of the user’s query context, but not enough attention has been given to the other query fields (predicates) that define the user’s information context. Emerging location-based service (LBS) providers use the location information of mobile users to provide them with relevant information based on their geographical positions. The information disseminated to mobile users can be context-sensitive and highly personalized. Therefore, an effective cache management scheme needs to adapt dynamically to the user’s query context. Additionally, both the cached data items and the prefetched ones should be determined and adjusted according to the user’s movement pattern and information context.

To evaluate a data item’s benefit with respect to the cache content, we propose a scheme that uses the query context as an information filtering mechanism, limiting the amount of prefetched information to the data items with maximum benefit. A main aspect of this work involves predicting the future context that will be required by the user. In some situations forecasting may be impossible, but where the content changes gradually and continuously, i.e., in continuous queries, it can be both feasible and very effective. Forecasting may be done, for example, by analyzing the user’s current query context. The purpose of predicting future contexts is to anticipate the user’s future retrieval needs and to perform the retrievals ahead of the need. Assuming the prediction is correct, the response to retrieval requests will then be very fast, since the necessary retrieval will have been done in advance. When a cache miss occurs, the mobile terminal (MT) asks for several other items, not just the missed data item, at little additional cost. This prevents future cache misses and reduces the number of uplink requests.
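
As a minimal sketch of this cache-miss behaviour (assumed interfaces only, not the paper’s implementation; benefit(), fetch_from_server() and the batch_size parameter are hypothetical placeholders), the fragment below bundles the missed item with the highest-benefit candidates into a single uplink request:

    # Minimal sketch of prefetch-on-miss; the helpers below are hypothetical.
    def benefit(item_id, query_context):
        """Assumed benefit score: count how many query-context predicates
        (callables) the candidate item satisfies."""
        return sum(1 for pred in query_context if pred(item_id))

    def fetch_from_server(item_ids):
        """Placeholder for one uplink request returning {item_id: data}."""
        return {i: ("data", i) for i in item_ids}

    def get(cache, item_id, candidate_ids, query_context, batch_size=4):
        """On a miss, request the missed item plus the highest-benefit
        candidates in one message, so that future misses are avoided."""
        if item_id in cache:
            return cache[item_id]                          # cache hit
        ranked = sorted((i for i in candidate_ids if i != item_id),
                        key=lambda i: benefit(i, query_context), reverse=True)
        batch = [item_id] + ranked[:batch_size - 1]
        cache.update(fetch_from_server(batch))             # one uplink, several items
        return cache[item_id]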

A mobility-based semantic cache structure and its query processing were first proposed in Dar et al. (1996). Ren and Dunham (2000) and Ren et al. (2003) extended this work by attaching location information to each segment, making it more efficient, and also proposed a cache replacement policy. Our work builds on this previous body of research on semantic cache management, but it focuses on the cache management prefetching strategy. To design an effective cache management strategy we consider two factors: the neighboring cells with valid information and the current query information context. Based on these two factors, we outline the major contributions of this paper below:

  • The key contribution of this study is a context-aware (CA) cache management prefetching strategy. It first uses the validity of the data (the valid scope distribution) based on their location to derive a set of most likely future cells, called the “prime” list of cells. Next, to identify the data items with the highest benefit for the cache content, the user’s query context is exploited to limit the amount of prefetched information within the predicted set of future cells (the prefetching zone); a sketch of these two steps follows this list.

  • A direct result of the proposed strategy is the formation and maintenance of a context-aware cache of data items with a high cache value, obtained at a low cost. The context-aware cache is then updated if the mobile user subsequently strays outside the predefined prime list of cells.

  • The performance of the cache management strategy is examined through a number of experiments, which show that context-aware prefetching produces significant improvements over standard direction- and velocity-based prefetching strategies.
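
The following sketch illustrates the two steps referenced in the first bullet under assumed data structures (valid_scope, neighbours() and the predicate-based query context are placeholders, not the paper’s definitions): derive the prime list from the valid scope distribution, then filter the prefetch candidates by the query context within that zone.

    # Sketch of the two-step CA selection; data structures are assumed:
    # valid_scope: {item_id: set of cell ids where the item's answer is valid}
    # neighbours(cell): cells adjacent to the current cell.

    def prime_list(current_cell, neighbours, valid_scope):
        """Neighbouring cells in which at least one candidate item stays valid."""
        return [c for c in neighbours(current_cell)
                if any(c in scope for scope in valid_scope.values())]

    def prefetch_candidates(prime_cells, valid_scope, query_context):
        """Items valid somewhere in the prefetching zone that also satisfy
        every predicate of the user's query context."""
        zone = set(prime_cells)
        return [item for item, scope in valid_scope.items()
                if scope & zone and all(pred(item) for pred in query_context)]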

The rest of the paper is organized as follows: Section 2 gives a brief description of research related to location-dependent data caching and prefetching strategies. Section 3 presents the mobile architecture together with the mobility and query models; this section also explains the semantic cache description at the mobile client. Section 4 presents the proposed cache management strategy and its components, i.e., the query-context prefetching method and the prefetching algorithm. Section 5 presents a prefetching cost analysis. Section 6 describes the simulation model, and the performance comparison is discussed in Section 7. Finally, Section 8 provides concluding remarks and future plans regarding the presented research.

Section snippets

Previous work

Research on cache management has been active over the past few years. The least recently used (LRU) replacement policy, which evicts the object that has not been accessed for the longest time, works well when the most recently referenced objects are most likely to be referenced again in the near future. Ren and Dunham (2000) have proposed a mobility model to represent the moving behavior of mobile users and have formally defined location-dependent queries. Based on their mobility model, they
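
For reference, the LRU policy described above can be captured in a few lines; this is a generic illustration of the policy, not code from the surveyed work.

    from collections import OrderedDict

    class LRUCache:
        """Evicts the entry that has gone longest without being accessed."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.entries = OrderedDict()

        def get(self, key):
            if key not in self.entries:
                return None                       # miss
            self.entries.move_to_end(key)         # mark as most recently used
            return self.entries[key]

        def put(self, key, value):
            if key in self.entries:
                self.entries.move_to_end(key)
            self.entries[key] = value
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)  # drop the least recently used entry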

Mobile architecture

In a mobile architecture (Fig. 1), the geographical coverage area for the information service is partitioned into service areas, with each service area attached to a data server. A service area may cover one or multiple cells. Each service area is associated with a service_id for identification purposes; this id is broadcast periodically to all the mobile clients in that service area. The database associated with each service area is a collection of data items. Every data server keeps a
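
One possible reading of this organisation is sketched below; the class and field names are illustrative assumptions, not taken from the paper.

    from dataclasses import dataclass, field

    @dataclass
    class DataServer:
        """Holds the database (a collection of data items) for one service area."""
        items: dict = field(default_factory=dict)      # item_id -> data

    @dataclass
    class ServiceArea:
        """Covers one or more cells and periodically broadcasts its service_id."""
        service_id: int
        cell_ids: set
        server: DataServer = field(default_factory=DataServer)

        def broadcast_id(self):
            # Periodic downlink broadcast to all mobile clients in the area.
            return self.service_id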

The cache prefetching components

Prefetching is a technique mainly concerned with improving system performance. Caching alone is generally not enough to improve the performance of mobile systems; moreover, prefetching has a broader application range than simply storing already used data in a cache. Prefetching, together with replacement, is used to support cache management. To avoid excessive network traffic and prefetching cycles, the prefetching mechanism has to consider different strategies to increase the efficiency

Prefetching cost analysis model

In a mobile wireless environment, the scarcest and most critical resources are those strictly connected to the use of portable devices to access the information service, such as disk space, processing power, wireless link bandwidth and energy. In this paper, we focus our attention on bandwidth and define the cost of prefetching as the consumption of this scarce resource. As a measure of bandwidth occupancy, we consider the number of bytes per second (W) that traverse in both
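
One straightforward way to account for this metric (an assumption, since the exact accounting is not reproduced in this snippet) is to sum the uplink and downlink traffic over an observation window:

    def bandwidth_occupancy(uplink_bytes, downlink_bytes, interval_seconds):
        """W: bytes per second crossing the wireless link in both directions,
        summing uplink requests and downlink replies over the window."""
        return (sum(uplink_bytes) + sum(downlink_bytes)) / interval_seconds

    # Example: three 120-byte requests and three 4 KB replies over 10 s
    # give W = (360 + 12288) / 10 = 1264.8 bytes per second.
    w = bandwidth_occupancy([120] * 3, [4096] * 3, 10.0)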

The environment

In our simulation, we model a wireless infrastructure-based network. The simulated area in which the mobile user’s movement takes place is described using an N × N rectangular array of cells (Fig. 7), represented by fixed-size hexagons (other shapes may also be used). The same cell array was adopted in previous works (Ho and Akyildiz, 1995; Akyildiz et al., 1996; Hac and Zhou, 1997). To set up the simulation environment, we investigated both using the network simulator (ns-2) (The Network Simulator, 2002
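
A minimal way to lay out such a cell array is sketched below, assuming pointy-top hexagons of a fixed circumradius; the paper’s exact geometry and parameters are not reproduced here.

    import math

    def hex_cell_centres(n, radius=1.0):
        """Centres of an n-by-n array of pointy-top hexagonal cells,
        with odd rows offset by half a cell width."""
        width = math.sqrt(3) * radius      # horizontal spacing between centres
        height = 1.5 * radius              # vertical spacing between rows
        return {(row, col): (col * width + (width / 2 if row % 2 else 0),
                             row * height)
                for row in range(n) for col in range(n)}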

Performance evaluation

In this section, the proposed location- and context-aware cache management strategy (CA) is evaluated and compared with the standard directional scheme based on tangent velocity (Park et al., 2004), denoted as non-CA or DR. The cache hit ratio (h) is used as the primary performance metric, because most of the other performance results can be derived from it. The cache hit ratio is the percentage of all the requests that can be satisfied by searching the cache for a copy of the
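
As a worked example of this metric (the counts are illustrative, not simulation results), the hit ratio is simply the fraction of requests answered from the cache:

    def cache_hit_ratio(hits, total_requests):
        """h: fraction of requests satisfied directly from the cache."""
        return hits / total_requests if total_requests else 0.0

    # Example: 340 of 500 requests served from the cache gives h = 0.68 (68%).
    h = cache_hit_ratio(340, 500)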

Conclusions and future work

In a mobile computing paradigm, caching alone is generally not enough to improve the performance of mobile systems. Moreover, prefetching has a broader application range than simply storing already used data in the cache. In this paper, we have considered both the movement pattern and the query pattern. Noticing that the query has valid answers only within a set of cells predefined for each data item (item valid scope), we first proposed an efficient future cell prediction filtering mechanism

Acknowledgements

The authors would like to acknowledge Carol Choi and Phong Luu for their programming support in the simulation part of this work under NSF Grant Nos. 212400512 and 212400513, and Dr. Zuji Mao (Lucent Technologies) for his help with the formulation of the main concepts of this research. The authors are also very grateful to the anonymous reviewers for their comments and critique, which improved the quality of this paper significantly.

References (25)

  • S. Park et al., Improving prediction level of prefetching for location-aware mobile information service, Future Generation Computer Systems (2004)
  • I.F. Akyildiz et al., The predictive user mobility profile framework for wireless multimedia networks, IEEE/ACM Transactions on Networking (2004)
  • I.F. Akyildiz et al., Movement-based location update and selective paging for PCS networks, IEEE/ACM Transactions on Networking (1996)
  • A. Aljadhai et al., Predictive mobility support for QoS provisioning in mobile wireless environments, IEEE Journal on Selected Areas in Communications (2001)
  • S. Dar et al., Semantic data caching and replacement (1996)
  • A. Datta et al., World wide wait: a study of internet scalability and cache-based approaches to alleviate it, Management Science (2003)
  • Drakatos, S., Pissinou, N., Makki, K., Douligeris, C., 2006. A future location aware prediction replacement strategy...
  • Dunham, M.H., Kumar, V., 1998. Location dependent data and its management in mobile computing. In: 9th International...
  • A. Hac et al., Locating strategies for personal communication networks: a novel tracking strategy, IEEE Journal on Selected Areas in Communications (1997)
  • J.S.M. Ho et al., Mobile user location update and paging under delay constraints, ACM-Baltzer Journal of Wireless Networks (1995)
  • D. Lam et al., Teletraffic modeling for personal communications services, IEEE Communications Magazine (1997)
  • D.L. Lee et al., Data management in location-dependent information services, IEEE Pervasive Computing (2002)
Stylianos Drakatos has more than 15 years of experience (including Big Five firms) in the field of Telecommunications and Information Technology. With BellSouth, he participated in the design and implementation of a number of technology integration projects as a member of the planning and re-engineering team. He has also worked with several other major companies, such as Price Waterhouse in New York City as a Principal Consultant and Florida Power & Light Co. in Miami, Florida. He is currently a lecturer in the Decision Sciences and Information Systems department in the College of Business at Florida International University.

Niki Pissinou received her Ph.D. in Computer Science from the University of Southern California, her M.Sc. in Computer Science from the University of California at Riverside, and her B.S.I.S.E. in Industrial and Systems Engineering from the Ohio State University. She is currently a tenured professor and the director of the Telecommunications & Information Technology Institute at Florida International University in Miami, Florida. Previously, she was tenured faculty at the Center for Advanced Computer Studies at the University of Louisiana at Lafayette, where she was also the director of the Telecommunication & Information Technology Laboratory, partially funded by NASA, and co-director of NOMAD: A Wireless and Nomadic Laboratory, partially funded by NSF, and of the Advanced Network Laboratory. Dr. Pissinou is active in the fields of computer networks, information technology and distributed systems. She has published more than 100 refereed publications and has received best paper awards. She has also co-edited eight volumes and is the author of an upcoming book on wireless Internet computing. She has received extensive funding for her research from agencies such as NSF, NASA, DARPA and ARO, including two recent NSF awards on wireless networks.

Kia Makki received his Ph.D. in Computer Science from the University of California, Davis, and two M.S. degrees from The Ohio State University and West Coast University. He is currently a tenured professor with an endowed chair professorship and an Associate Dean of the College of Engineering at Florida International University. Previously he was tenured faculty with an O’Krepki Chair professorship at the University of Louisiana at Lafayette, the Director of the Center for Telecommunications Studies, and the Co-Director of the NOMAD and Advanced Computer Laboratory, which is partially funded by NSF and NASA. His technical interests include wireless and optical telecommunications. He has numerous refereed publications and his work has been extensively cited in books and papers. Dr. Makki has served as an Associate Editor, Editorial Board Member and Guest Editor of several international journals, including the editorial boards of several telecommunication journals. He has also been involved in numerous conferences as a Steering Committee Member, General Chair, Program Chair, Program Vice-Chair and Program Committee Member.

Christos Douligeris received the diploma in electrical engineering from the National Technical University of Athens in 1984 and the M.S., M.Phil., and Ph.D. degrees from Columbia University in 1985, 1987, and 1990, respectively. He has held positions with the Department of Electrical and Computer Engineering at the University of Miami, where he reached the rank of associate professor and was the associate director for engineering at the Ocean Pollution Research Center. He is currently an associate professor in the Department of Informatics at the University of Piraeus, Greece. He has served on the technical program committees of several conferences. His main technical interests lie in the areas of performance evaluation of high-speed networks, security in networking, resource allocation in wireless networks, information management, risk assessment, and evaluation for emergency response operations. He is an editor of IEEE Communications Letters, Computer Networks and IEEE Network, a former technical editor of IEEE Communications, a senior member of the IEEE, and a member of the ACM and the Technical Chamber of Greece.
