Abstract:
Support vector regressors (SVRs) are a class of nonlinear regressor inspired by Vapnik's support vector (SV) method for pattern classification. The standard SVR has been successfully applied to real-number regression problems such as financial prediction and weather forecasting. However, in some applications the domain of the function to be estimated may be more naturally and efficiently expressed using complex numbers (e.g., communications channels) or quaternions (e.g., 3-dimensional geometrical problems). Since SVRs have previously been proven to be efficient and accurate regressors, the extension of this method to complex numbers and quaternions is of great interest. In the present paper the standard SVR method is extended to cover regression in complex numbers and quaternions. Our method differs from existing approaches insofar as the cost function applied in the output space is rotationally invariant, which is important because in most cases it is the magnitude of the error in the output that matters, not its angle. We demonstrate the practical usefulness of this new formulation by considering the problem of communications channel equalization.
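The rotational invariance described above can be illustrated with a hedged sketch: an ε-insensitive loss applied to the *magnitude* of a complex-valued error. This is a hypothetical illustration of the general idea, not the paper's exact formulation; the function name and the value of `eps` are assumptions.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Rotationally invariant epsilon-insensitive loss for complex targets.

    Only the magnitude of the complex error is penalized, so rotating the
    error vector by any angle leaves the loss unchanged. Errors with
    magnitude below eps incur zero loss, as in the standard SVR tube.
    (Illustrative sketch only, not the authors' exact cost function.)
    """
    err = np.abs(y_true - y_pred)      # |error|: invariant under rotation
    return np.maximum(0.0, err - eps)  # epsilon-insensitive hinge on |error|

# A real-axis error and an imaginary-axis error of equal magnitude
# receive identical loss, demonstrating rotational invariance.
loss_real = eps_insensitive_loss(0.5 + 0.0j, 0.0 + 0.0j)
loss_imag = eps_insensitive_loss(0.0 + 0.5j, 0.0 + 0.0j)
```

The same construction carries over to quaternions by replacing the complex modulus with the quaternion norm.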
Published in: 2007 International Joint Conference on Neural Networks
Date of Conference: 12-17 August 2007
Date Added to IEEE Xplore: 29 October 2007