Abstract
A conventional quasi-orthogonal space-time block code (QO-STBC) scheme can achieve full rate, but at the cost of decoding complexity. This limitation of the conventional QO-STBC scheme is mainly due to interference terms in the detection matrix. In this article, a novel QO-STBC scheme is proposed that eliminates the interference terms. The proposed method achieves improved diversity compared to the conventional QO-STBC scheme, while also providing a considerable reduction in decoding complexity. A transmit antenna shuffling scheme for the proposed code is also illustrated. It is shown that by adaptively mapping the space-time sequences of the proposed code to appropriate transmit antennas depending on channel conditions, the proposed scheme can improve its transmit diversity with limited feedback information. Finally, simulation results show that the symbol error rate performance is improved considerably.
Appendix
Singular value decomposition (SVD) is a result from linear algebra stating that a rectangular matrix D can be broken down into the product of three matrices: an orthogonal matrix U, a diagonal matrix S, and the transpose of an orthogonal matrix W. The theorem is usually presented as
\[ \hbox {D}=\hbox {USW}^{\mathrm{T}} \]
where \(\hbox {UU}^{\mathrm{T}}=\hbox {I}\) and \(\hbox {WW}^{\mathrm{T}}=\hbox {I}\); the columns of U are orthonormal eigenvectors of \(\hbox {DD}^{\mathrm{T}}\), the columns of W are orthonormal eigenvectors of \(\hbox {D}^{\mathrm{T}}\hbox {D}\), and S is a diagonal matrix containing the square roots of the eigenvalues of \(\hbox {DD}^{\mathrm{T}}\) (equivalently, of \(\hbox {D}^{\mathrm{T}}\hbox {D}\)) in descending order. Start with the matrix
\[ \hbox {D}=\begin{bmatrix} a&{}0&{}0&{}b\\ 0&{}a&{}-b&{}0\\ 0&{}-b&{}a&{}0\\ b&{}0&{}0&{}a \end{bmatrix} \]
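As a quick numerical sanity check of the factorization, the whole decomposition can be computed directly. This is only an illustrative sketch: the sample values \(a=3\), \(b=1\) are chosen arbitrarily, and the matrix below follows the structure implied by the eigenvector relations derived later in this appendix.

```python
import numpy as np

# Arbitrary sample values standing in for the scalars a and b of the appendix.
a, b = 3.0, 1.0

# Matrix structure consistent with the eigenvector relations derived below
# (x1 = x4, x2 = -x3 for (a+b)^2; x1 = -x4, x2 = x3 for (a-b)^2).
D = np.array([[a, 0, 0, b],
              [0, a, -b, 0],
              [0, -b, a, 0],
              [b, 0, 0, a]])

# numpy returns D = U @ diag(s) @ Wt with singular values s in descending order.
U, s, Wt = np.linalg.svd(D)

# Singular values are a+b (twice) and a-b (twice), matching the S derived below.
assert np.allclose(s, [a + b, a + b, a - b, a - b])
assert np.allclose(U @ np.diag(s) @ Wt, D)
```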
In order to find U, start with \(\hbox {DD}^{\mathrm{T}}\):
\[ \hbox {DD}^{\mathrm{T}}=\begin{bmatrix} a^{2}+b^{2}&{}0&{}0&{}2ab\\ 0&{}a^{2}+b^{2}&{}-2ab&{}0\\ 0&{}-2ab&{}a^{2}+b^{2}&{}0\\ 2ab&{}0&{}0&{}a^{2}+b^{2} \end{bmatrix} \]
Now find the eigenvalues and corresponding eigenvectors of \(\hbox {DD}^{\mathrm{T}}\). Eigenvectors are defined by the equation \(\hbox {A}\vec {v}=\lambda \vec {v}\); applying this to \(\hbox {DD}^{\mathrm{T}}\) gives:
\[ \left( \hbox {DD}^{\mathrm{T}}-\lambda \hbox {I}\right) \vec {x}=0 \]
Rewrite the above matrix equation as the set of equations
\[ \begin{aligned} (a^{2}+b^{2}-\lambda )x_{1}+2ab\,x_{4}&=0\\ (a^{2}+b^{2}-\lambda )x_{2}-2ab\,x_{3}&=0\\ -2ab\,x_{2}+(a^{2}+b^{2}-\lambda )x_{3}&=0\\ 2ab\,x_{1}+(a^{2}+b^{2}-\lambda )x_{4}&=0 \end{aligned} \]
which are solved by setting the determinant of the coefficient matrix to zero,
\[ \det \begin{bmatrix} a^{2}+b^{2}-\lambda &{}0&{}0&{}2ab\\ 0&{}a^{2}+b^{2}-\lambda &{}-2ab&{}0\\ 0&{}-2ab&{}a^{2}+b^{2}-\lambda &{}0\\ 2ab&{}0&{}0&{}a^{2}+b^{2}-\lambda \end{bmatrix}=0 \]
which works out as
\[ \left[ \left( a^{2}+b^{2}-\lambda \right) ^{2}-4a^{2}b^{2}\right] ^{2}=0 \]
This gives the eigenvalues \(\lambda =(a+b)^{2}\) (with multiplicity two) and \(\lambda =(a-b)^{2}\) (with multiplicity two). Substituting each \(\lambda \) back into the original equations gives the eigenvectors.
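The eigenvalue computation can be checked numerically. A minimal sketch, assuming \(\hbox {DD}^{\mathrm{T}}\) takes the block form consistent with the equations above, with arbitrary sample values \(a=3\), \(b=1\):

```python
import numpy as np

a, b = 3.0, 1.0  # arbitrary sample values for the scalars a and b

# DD^T in the form consistent with the coefficient equations above.
DDt = np.array([[a**2 + b**2, 0, 0, 2*a*b],
                [0, a**2 + b**2, -2*a*b, 0],
                [0, -2*a*b, a**2 + b**2, 0],
                [2*a*b, 0, 0, a**2 + b**2]])

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order.
eigvals = np.linalg.eigvalsh(DDt)

# Two eigenvalues equal (a-b)^2 and two equal (a+b)^2, as derived above.
assert np.allclose(eigvals, [(a - b)**2, (a - b)**2, (a + b)**2, (a + b)**2])
```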
For \(\lambda =(a+b)^{2}\), we have \(a^{2}+b^{2}-\lambda =-2ab\), so the equations reduce to
\[ -2ab\,x_{1}+2ab\,x_{4}=0,\qquad -2ab\,x_{2}-2ab\,x_{3}=0 \]
Solving these equations gives \(x_1=x_4\) and \(x_2=-x_3\). Thus, corresponding to the eigenvalue \(\lambda =(a+b)^{2}\), the eigenvectors are \([1\ 1\ {-1}\ 1]\) and \([1\ {-1}\ 1\ 1]\).
For \(\lambda =(a-b)^{2}\), we have \(a^{2}+b^{2}-\lambda =2ab\), so the equations reduce to
\[ 2ab\,x_{1}+2ab\,x_{4}=0,\qquad 2ab\,x_{2}-2ab\,x_{3}=0 \]
Solving these equations gives \(x_1=-x_4\) and \(x_2=x_3\). Thus, corresponding to the eigenvalue \(\lambda =(a-b)^{2}\), the eigenvectors are \([1\ 1\ 1\ {-1}]\) and \([{-1}\ 1\ 1\ 1]\).
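Each claimed eigenpair can be verified directly against the defining relation \(\hbox {A}\vec {v}=\lambda \vec {v}\). A sketch under the same assumptions as before (block-form \(\hbox {DD}^{\mathrm{T}}\), arbitrary sample values \(a=3\), \(b=1\)):

```python
import numpy as np

a, b = 3.0, 1.0  # arbitrary sample values
DDt = np.array([[a**2 + b**2, 0, 0, 2*a*b],
                [0, a**2 + b**2, -2*a*b, 0],
                [0, -2*a*b, a**2 + b**2, 0],
                [2*a*b, 0, 0, a**2 + b**2]])

# The four eigenvectors derived above, paired with their eigenvalues.
pairs = [([1, 1, -1, 1], (a + b)**2),
         ([1, -1, 1, 1], (a + b)**2),
         ([1, 1, 1, -1], (a - b)**2),
         ([-1, 1, 1, 1], (a - b)**2)]

for vec, lam in pairs:
    v = np.array(vec, dtype=float)
    assert np.allclose(DDt @ v, lam * v)  # A v = lambda v holds for each pair
```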
These eigenvectors become column vectors in a matrix, ordered by the size of the corresponding eigenvalue. In the matrix below, the eigenvectors for \(\lambda =(a+b)^{2}\) are in columns one and two, and the eigenvectors for \(\lambda =(a-b)^{2}\) are in columns three and four:
\[ \begin{bmatrix} 1&{}1&{}1&{}-1\\ 1&{}-1&{}1&{}1\\ -1&{}1&{}1&{}1\\ 1&{}1&{}-1&{}1 \end{bmatrix} \]
Finally, by applying the Gram-Schmidt orthonormalization process to the column vectors, the above matrix is converted into an orthogonal matrix. Begin by normalizing:
For \(\vec {v}_1=[1\ 1\ {-1}\ 1]^{\mathrm{T}}\): \(\vec {u}_1=\vec {v}_1/\Vert \vec {v}_1\Vert =\frac{1}{2}[1\ 1\ {-1}\ 1]^{\mathrm{T}}\)
For \(\vec {v}_2=[1\ {-1}\ 1\ 1]^{\mathrm{T}}\): \(\vec {u}_2=\vec {v}_2/\Vert \vec {v}_2\Vert =\frac{1}{2}[1\ {-1}\ 1\ 1]^{\mathrm{T}}\)
For \(\vec {v}_3=[1\ 1\ 1\ {-1}]^{\mathrm{T}}\): \(\vec {u}_3=\vec {v}_3/\Vert \vec {v}_3\Vert =\frac{1}{2}[1\ 1\ 1\ {-1}]^{\mathrm{T}}\)
For \(\vec {v}_4=[{-1}\ 1\ 1\ 1]^{\mathrm{T}}\): \(\vec {u}_4=\vec {v}_4/\Vert \vec {v}_4\Vert =\frac{1}{2}[{-1}\ 1\ 1\ 1]^{\mathrm{T}}\)
(Since the four column vectors are already mutually orthogonal, Gram-Schmidt reduces to normalization; each vector has norm 2.)
All these \(\vec {u}\)'s result in an orthogonal matrix U, which is given by
\[ \hbox {U}=\frac{1}{2}\begin{bmatrix} 1&{}1&{}1&{}-1\\ 1&{}-1&{}1&{}1\\ -1&{}1&{}1&{}1\\ 1&{}1&{}-1&{}1 \end{bmatrix} \]
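The orthogonality of U is easy to verify numerically. A minimal check, building U from the normalized eigenvectors derived above (columns ordered by eigenvalue size):

```python
import numpy as np

# Columns are the normalized eigenvectors u1..u4 from the Gram-Schmidt step.
U = 0.5 * np.array([[1, 1, 1, -1],
                    [1, -1, 1, 1],
                    [-1, 1, 1, 1],
                    [1, 1, -1, 1]])

# U is orthogonal: both UU^T and U^T U equal the identity.
assert np.allclose(U @ U.T, np.eye(4))
assert np.allclose(U.T @ U, np.eye(4))
```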
In order to find W, start with \(\hbox {D}^{\mathrm{T}}\hbox {D}\). Since D is a symmetric matrix, \(\hbox {D}^{\mathrm{T}}\hbox {D}=\hbox {DD}^{\mathrm{T}}\). This means that W is equal to the matrix U derived above from \(\hbox {DD}^{\mathrm{T}}\). So \(\hbox {W}^{\mathrm{T}}\) is given by
\[ \hbox {W}^{\mathrm{T}}=\hbox {U}^{\mathrm{T}}=\frac{1}{2}\begin{bmatrix} 1&{}1&{}-1&{}1\\ 1&{}-1&{}1&{}1\\ 1&{}1&{}1&{}-1\\ -1&{}1&{}1&{}1 \end{bmatrix} \]
For finding S, take the square roots of the eigenvalues and populate the diagonal with them, putting the largest first. Assuming \(a>b>0\), S is given by
\[ \hbox {S}=\begin{bmatrix} a+b&{}0&{}0&{}0\\ 0&{}a+b&{}0&{}0\\ 0&{}0&{}a-b&{}0\\ 0&{}0&{}0&{}a-b \end{bmatrix} \]
So \(\hbox {D}=\hbox {USW}^{\mathrm{T}}\) is given by
\[ \hbox {D}=\frac{1}{2}\begin{bmatrix} 1&{}1&{}1&{}-1\\ 1&{}-1&{}1&{}1\\ -1&{}1&{}1&{}1\\ 1&{}1&{}-1&{}1 \end{bmatrix} \begin{bmatrix} a+b&{}0&{}0&{}0\\ 0&{}a+b&{}0&{}0\\ 0&{}0&{}a-b&{}0\\ 0&{}0&{}0&{}a-b \end{bmatrix} \frac{1}{2}\begin{bmatrix} 1&{}1&{}-1&{}1\\ 1&{}-1&{}1&{}1\\ 1&{}1&{}1&{}-1\\ -1&{}1&{}1&{}1 \end{bmatrix} =\begin{bmatrix} a&{}0&{}0&{}b\\ 0&{}a&{}-b&{}0\\ 0&{}-b&{}a&{}0\\ b&{}0&{}0&{}a \end{bmatrix} \]
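The full reconstruction can be confirmed end to end. A sketch assembling U, S, and W as derived in this appendix, with arbitrary sample values \(a=3\), \(b=1\) (so that \(a>b>0\) and the singular values are positive):

```python
import numpy as np

a, b = 3.0, 1.0  # arbitrary sample with a > b > 0

# The matrix whose SVD this appendix derives (structure per the eigen relations).
D = np.array([[a, 0, 0, b],
              [0, a, -b, 0],
              [0, -b, a, 0],
              [b, 0, 0, a]])

# U from the normalized eigenvectors; W = U since D is symmetric.
U = 0.5 * np.array([[1, 1, 1, -1],
                    [1, -1, 1, 1],
                    [-1, 1, 1, 1],
                    [1, 1, -1, 1]])
W = U
S = np.diag([a + b, a + b, a - b, a - b])

# Multiplying the three factors recovers D exactly.
assert np.allclose(U @ S @ W.T, D)
```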
Cite this article
Sharma, V., Sharma, S. Novel Linear Decodable QO-STBC for Four Transmit Antennas with Transmit Antenna Shuffling. Wireless Pers Commun 82, 47–59 (2015). https://doi.org/10.1007/s11277-014-2192-2