Abstract:
We consider amplify-and-forward relay networks and expand upon prior work in which relay networks were treated as neural networks. This approach recognizes that amplifier saturation gives each relay a non-linear behavior similar to that of a neuron. It thus allows the network to use the full range of the relay outputs and to be trained with deep learning optimization methods (originally designed to train neural networks). This is in contrast to traditional methods, which operate relays in their linear regime. Unlike previous works, which considered the non-linearity only for real signals and square pulse shapes, in this work we consider practical baseband signals and general pulse shapes. Using a complex network representation, we show that the similarity between relay networks and neural networks also extends to this more realistic setting and allows significant power savings. For general pulse shapes, we further show that significant gains remain achievable by training with multiple samples per symbol. These contributions strengthen the idea of treating relay networks as neural networks by moving it closer to practicality. Thus, even without adding any neural network to the system, treating relays as neurons and training them with deep learning techniques yields significant gains.
Published in: 2024 IEEE 25th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC)
Date of Conference: 10-13 September 2024
Date Added to IEEE Xplore: 07 October 2024
Electronic ISSN: 1948-3252
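The idea described in the abstract can be illustrated with a minimal sketch (not the authors' code): a single amplify-and-forward relay whose saturating amplifier plays the role of a neuron's activation, with the relay gain trained by a deep-learning optimizer. The channel gains, noise levels, tanh AM/AM curve, initial gain, and symbol-recovery loss below are illustrative assumptions, not details taken from the paper.

```python
import torch

torch.manual_seed(0)

def saturating_amp(x, gain, a_sat=2.0):
    """Memoryless tanh AM/AM saturation applied to complex samples stored as (I, Q) pairs."""
    amp = torch.linalg.norm(x, dim=-1, keepdim=True)      # instantaneous amplitude |x|
    out_amp = a_sat * torch.tanh(gain * amp / a_sat)      # saturating amplitude response
    return x * (out_amp / (amp + 1e-9))                   # preserve the phase, rescale the amplitude

# QPSK-like source symbols as complex baseband, one sample per symbol for simplicity.
n = 1024
symbols = (torch.randint(0, 2, (n, 2)).float() * 2 - 1) / 2 ** 0.5

# Real-valued channel gains (a simplification of complex fading) and relay input noise.
h_sr, h_rd = 0.8, 0.6
relay_in = h_sr * symbols + 0.05 * torch.randn(n, 2)

# The relay gain is the trainable "weight"; Adam is the deep-learning optimizer.
gain = torch.nn.Parameter(torch.tensor(1.0))
opt = torch.optim.Adam([gain], lr=0.02)

for step in range(500):
    relay_out = saturating_amp(relay_in, gain)            # the relay acts like a neuron
    dest = h_rd * relay_out + 0.05 * torch.randn(n, 2)    # destination observation
    loss = torch.mean((dest - symbols) ** 2)              # toy symbol-recovery objective
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"trained relay gain: {gain.item():.3f}, final MSE: {loss.item():.4f}")
```

In this toy model the saturation acts on the amplitude only (an AM/AM curve with no AM/PM distortion), so the complex phase passes through unchanged while gradients still flow through the non-linear region; the optimizer is free to push the relay into saturation rather than backing off into the linear regime.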