Abstract:
Each hidden-layer neuron in a multi-layer perceptron neural network comprises synaptic weights, an adder, and an activation function. The number of synaptic weights required per neuron is application specific, and our contribution is a neuron implementation tailored for use in the complex baseband predistortion of a class-AB power amplifier driven by a wideband stimulus. The wideband, dynamic nature of the input calls for a neuron with ten synaptic weights, a number obtained through behavioral training and measurement. The performance of our neuron implementation is verified by measuring the gain control, linearity, and bandwidth of the synaptic weights as well as of the nonlinear activation function. Finally, the implementation is compared with previously reported neurons (both analog and digital) in terms of bandwidth, power consumption, and linearity.
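A minimal behavioral sketch of the neuron structure described above (ten synaptic weights, an adder, and a nonlinear activation). This is not the paper's circuit implementation; the tanh activation, bias term, and function names are illustrative assumptions only.

```python
# Illustrative sketch (not the paper's hardware implementation): one hidden-layer
# neuron with ten synaptic weights, an adder, and a nonlinear activation,
# operating on a tapped delay line of baseband input samples.
import numpy as np

NUM_WEIGHTS = 10  # ten synaptic weights per neuron, per the abstract

def neuron_output(x_taps, weights, bias=0.0):
    """Weighted sum of the ten input taps followed by a nonlinear activation.

    x_taps  : array of 10 recent baseband input samples (assumed real-valued)
    weights : array of 10 synaptic weights obtained from training
    bias    : optional bias term (assumed here for generality)
    """
    assert len(x_taps) == NUM_WEIGHTS and len(weights) == NUM_WEIGHTS
    s = np.dot(weights, x_taps) + bias  # the adder (weighted sum)
    return np.tanh(s)                   # example nonlinear activation function

# Example: response of one neuron to ten consecutive input samples
x = np.random.randn(NUM_WEIGHTS)
w = np.random.randn(NUM_WEIGHTS)
print(neuron_output(x, w))
```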
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015