Measuring the mean Web page size and its compression to limit latency and improve download time
Abstract
Web traffic is doubling every year, according to recent global studies. Users need more information from Web sites and want to spend as little time as possible downloading it. At the same time, more Internet bandwidth is needed, and all ISPs are trying to build high-bandwidth networks. This paper presents a case study that calculates the reduction in the time needed for a Web page to be fully downloaded and delivered to the user. It presents a way to calculate the reduction in data transfer, bandwidth consumption and response time when HTTP/1.1's compression feature is enabled (whether for plain hypertext files, the text output of CGI programs, or dynamically generated pages). Measurements are taken from five popular Web sites in order to validate our claim of reduced transfer time. Defining the mean size of a Web page on commercial Web sites is also within the scope of this paper.
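The compression the abstract refers to can be illustrated with a minimal sketch (not taken from the paper): gzip-compressing an HTML payload, as an HTTP/1.1 server serving `Content-Encoding: gzip` would, and measuring the size reduction. The sample HTML string here is an assumption for demonstration only.

```python
import gzip

# Hypothetical sample payload: repetitive hypertext markup, which
# compresses well under gzip's DEFLATE algorithm.
html = ("<html><head><title>Example</title></head><body>"
        + "<p>Repeated hypertext content compresses well.</p>" * 200
        + "</body></html>").encode("ascii")

# Compress as an HTTP/1.1 server with gzip content coding would.
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
print(f"compressed size is {ratio:.2%} of the original")
```

On a fixed-bandwidth link, the transfer-time saving is roughly proportional to the byte saving, which is the effect the paper's measurements quantify.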
Citation
Destounis, P., Garofalakis, J., Kappos, P. and Tzimas, J. (2001), "Measuring the mean Web page size and its compression to limit latency and improve download time", Internet Research, Vol. 11 No. 1, pp. 10-17. https://doi.org/10.1108/10662240110365661
Publisher
MCB UP Ltd
Copyright © 2001, MCB UP Limited