Abstract:
Leveraging various sensors, e.g., cameras and LiDARs, is deemed a promising way to realize fast millimeter wave (mmWave) beam alignment with no spectrum overhead. However, previous methods that place extra sensors at the base station (BS) and mobile station (MS) may increase communication system cost and also raise privacy concerns. In this paper, we propose a novel beam alignment framework that utilizes images taken by a camera placed at a third-party perspective. We design a deep neural network that extracts and fuses the position and orientation of the mobile user in the third-party images to infer the optimal beam pair, and we establish real-world datasets to train and evaluate the proposed network. Specifically, we design a lightweight mmWave beam sweeping system to reduce the time cost and system complexity of dataset collection. To measure the time consumption of the third-party-camera-aided beam alignment framework, we independently implement a mmWave communication prototype with a 410 MHz bandwidth OFDM baseband, compliant with the IEEE 802.11 protocol, on a Xilinx RFSoC. The experimental results show that the third-party-camera-aided beam alignment approach achieves over 98% top-5 accuracy and reduces mmWave beam alignment time to below 1/50 of that incurred by exhaustive beam sweeping.
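The reported time saving follows from sweeping only a few network-predicted candidate beam pairs instead of the full codebook. A minimal sketch of this top-k candidate selection (the codebook size and random scores below are hypothetical stand-ins; the paper's actual network output and codebook are not given in this abstract):

```python
import numpy as np

def top_k_beam_pairs(scores, k=5):
    """Return indices of the k highest-scoring beam pairs.

    scores: 1-D array of per-beam-pair scores, e.g. the softmax
    output of a vision network over the flattened codebook.
    """
    return np.argsort(scores)[::-1][:k]

# Hypothetical codebook: 16 BS beams x 16 MS beams = 256 pairs.
rng = np.random.default_rng(0)
scores = rng.random(256)

candidates = top_k_beam_pairs(scores, k=5)

# Exhaustive sweep measures all 256 pairs; a camera-aided approach
# measures only the 5 candidates, a 256/5 ≈ 51x reduction in sweep
# count, consistent in spirit with the >50x time saving reported.
print(len(candidates))  # 5
```

Top-5 accuracy then means the true optimal pair lies among these candidates, so the short sweep recovers it with high probability.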
Date of Conference: 21-24 April 2024
Date Added to IEEE Xplore: 03 July 2024
ISBN Information: