Abstract:
Recent real-world measurements in dense, congested radio environments have pointed out the inefficiency of frame error-based bit-rate adaptation mechanisms, which significantly reduce network capacity by misinterpreting frame errors caused by collisions. These effects are likely to be amplified by the heavy use of media applications. Fortunately, traditional SNR-based rate adaptation, as well as the more recently proposed throughput-based and collision-aware rate adaptation algorithms, are expected to provide more robust performance in these scenarios. To our knowledge, however, their performance has never been experimentally validated in a congested environment. In this paper, we report our implementation experiences with rate adaptation in a dense, congested IEEE 802.11 network. We find that throughput-based adaptation, contrary to expectations, also suffers from poor bitrate selection: due to an increase in physical layer capture at lower bitrates, nodes can increase their individual throughput at the expense of cumulative network throughput. SNR-based rate adaptation performs well in static environments, but the lack of sufficient precision in RSSI measurements makes accurate rate selection in dynamic radio environments difficult. The use of RTS/CTS, in the spirit of collision-aware rate adaptation, shows throughput improvements for frame error-based algorithms and for throughput-based algorithms as well. However, the results are below expectations, likely due to RTS/CTS implementation issues on the Atheros 5212 platform.
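The following is an illustrative sketch (not from the paper) of the core problem the abstract describes: a minimal frame error-based controller in the spirit of ARF treats every lost frame as a channel error, so collision losses in a congested cell push the bitrate down; a collision-aware variant, in the spirit of CARA, uses the RTS/CTS outcome to discount collision losses. The rate set, thresholds, class names, and toy trace are assumptions chosen purely for illustration.

    # Illustrative sketch: frame error-based vs. collision-aware rate adaptation.
    # All names, thresholds, and the rate table below are illustrative assumptions.

    BITRATES_MBPS = [6, 9, 12, 18, 24, 36, 48, 54]  # assumed 802.11a/g rate set


    class FrameErrorRateControl:
        """Drops the bitrate after consecutive failures and raises it after
        consecutive successes; it cannot tell collisions from channel errors."""

        def __init__(self, fail_threshold=2, success_threshold=10):
            self.idx = len(BITRATES_MBPS) - 1
            self.fails = 0
            self.successes = 0
            self.fail_threshold = fail_threshold
            self.success_threshold = success_threshold

        def report(self, acked: bool) -> int:
            if acked:
                self.successes += 1
                self.fails = 0
                if self.successes >= self.success_threshold:
                    self.idx = min(self.idx + 1, len(BITRATES_MBPS) - 1)
                    self.successes = 0
            else:
                self.fails += 1
                self.successes = 0
                if self.fails >= self.fail_threshold:
                    # A collision looks identical to a channel error here,
                    # so congestion alone drives the rate down.
                    self.idx = max(self.idx - 1, 0)
                    self.fails = 0
            return BITRATES_MBPS[self.idx]


    class CollisionAwareRateControl(FrameErrorRateControl):
        """Same policy, but losses where the RTS/CTS exchange failed are
        attributed to collisions and do not count against the bitrate."""

        def report_with_rts(self, cts_received: bool, acked: bool) -> int:
            if not cts_received:
                # RTS was lost: treat as collision/congestion, keep the rate.
                return BITRATES_MBPS[self.idx]
            return self.report(acked)


    if __name__ == "__main__":
        # Toy trace: bursts of collision losses on an otherwise clean channel.
        ferc, cara = FrameErrorRateControl(), CollisionAwareRateControl()
        for i in range(40):
            collided = (i % 4 < 2)  # two collisions, then two clean slots
            ferc.report(acked=not collided)
            cara.report_with_rts(cts_received=not collided, acked=True)
        print("frame error-based ends at", BITRATES_MBPS[ferc.idx], "Mb/s")
        print("collision-aware ends at", BITRATES_MBPS[cara.idx], "Mb/s")

In this toy trace the frame error-based controller sinks to the lowest rate even though the channel is clean, while the collision-aware variant holds the top rate; this is only a conceptual illustration of the loss-attribution issue the paper measures, not a model of the algorithms it evaluates.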
Published in: 2007 IEEE International Symposium on a World of Wireless, Mobile and Multimedia Networks
Date of Conference: 18-21 June 2007
Date Added to IEEE Xplore: 22 October 2007