Abstract
Many previously reported works have shown that data pre-fetching can be an effective remedy for the well-known problem of memory stalls; reducing these stalls improves overall execution time for a given application. In this paper we propose a new n-gram model for stride prediction, realized as a dynamic pre-fetcher, in which we compute the conditional probabilities of stride sequences over the previous n steps, where n is an integer denoting the number of data elements considered. Strides that have already been pre-fetched are recorded so that, by the principle of locality of reference, a repeated reference to the same stride can be ignored: the data is already in memory and need not be pre-fetched again. The model also identifies the most probable and least probable stride sequences; this information can be used for further dynamic prediction. Experimental results show that the proposed model is considerably more efficient and provides the user with additional insight into the application's behavior. The model flushes its state once the number of mis-predictions exceeds a pre-determined limit. The model can be used to improve the performance of existing compiler-based Software Distributed Shared Memory (SDSM) systems.
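The abstract describes an n-gram predictor that conditions on the previous strides and flushes its tables once mis-predictions exceed a limit. The following is a minimal illustrative sketch of that general idea, not the authors' implementation; the class name, parameters, and flush policy are assumptions for illustration.

```python
from collections import defaultdict, deque

class NGramStridePredictor:
    """Illustrative sketch (not the paper's implementation) of an n-gram
    stride predictor: it counts how often each stride follows the previous
    (n-1)-stride context and predicts the most frequent continuation."""

    def __init__(self, n=3, flush_limit=8):
        self.n = n                      # n-gram order: (n-1)-stride context + 1
        self.flush_limit = flush_limit  # mis-prediction budget before flushing
        self.counts = defaultdict(lambda: defaultdict(int))
        self.context = deque(maxlen=n - 1)
        self.misses = 0

    def predict(self):
        """Return the most probable next stride for the current context,
        or None if the context has not been seen before."""
        follows = self.counts.get(tuple(self.context))
        if not follows:
            return None
        return max(follows, key=follows.get)

    def observe(self, stride):
        """Record an observed stride, updating the counts; flush the model
        once the number of mis-predictions exceeds the limit."""
        guess = self.predict()
        if guess is not None and guess != stride:
            self.misses += 1
            if self.misses > self.flush_limit:
                self.counts.clear()     # model flushes after too many misses
                self.misses = 0
        if len(self.context) == self.n - 1:
            self.counts[tuple(self.context)][stride] += 1
        self.context.append(stride)
```

For example, after observing the repeating stride pattern 4, 8, 4, 8, ..., the predictor's `predict()` returns the stride most frequently seen after the current two-stride context.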
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Ramisetti, S., Wankar, R., Rao, C.R. (2012). Design of n-Gram Based Dynamic Pre-fetching for DSM. In: Xiang, Y., Stojmenovic, I., Apduhan, B.O., Wang, G., Nakano, K., Zomaya, A. (eds) Algorithms and Architectures for Parallel Processing. ICA3PP 2012. Lecture Notes in Computer Science, vol 7440. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33065-0_23
Print ISBN: 978-3-642-33064-3
Online ISBN: 978-3-642-33065-0