
A Deep Learning Method for Sentence Embeddings Based on Hadamard Matrix Encodings


Abstract:

Sentence embedding has recently been attracting increased attention from the Natural Language Processing (NLP) community. An embedding maps a sentence to a vector of real numbers, with applications to similarity and inference tasks. Our method uses word embeddings, dependency parsing, a Hadamard matrix with a spread spectrum algorithm, and a deep learning neural network trained on the Sentences Involving Compositional Knowledge (SICK) corpus. Each dependency parsing label is associated with a row of a Hadamard matrix, and the word embeddings are stored at the corresponding rows of another matrix. Using the spread spectrum encoding algorithm, the two matrices are combined into a single one-dimensional vector. This embedding is then fed to a neural network, achieving 80% accuracy, while the best score from the SemEval 2014 competition is 84%. The advantages of this method are that it encodes sentences of any length, uses only fully connected neural networks, takes word order into account, and handles long-range word dependencies.
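The abstract does not give implementation details, but the encoding it describes can be sketched as follows: each dependency label selects a row of a Hadamard matrix, each word embedding is spread (elementwise multiplied) by that row, and the spread vectors are summed into one fixed-size sentence vector. This is a minimal illustrative sketch, not the authors' code; the embedding dimension, the label set, and the label-to-row mapping are all assumptions made here for the example.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction; n must be a power of 2.
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

DIM = 8  # toy embedding dimension (assumed; must be a power of 2 here)
H = hadamard(DIM)

# Hypothetical mapping from dependency-parse labels to Hadamard rows.
LABELS = ["nsubj", "root", "dobj", "det", "amod", "prep", "pobj", "advmod"]
ROW = {label: H[i] for i, label in enumerate(LABELS)}

def encode(words):
    """words: list of (embedding, dependency_label) pairs.

    Spreads each word embedding with the Hadamard row of its label
    and sums the results into a single fixed-size sentence vector,
    so sentences of any length map to the same dimensionality."""
    out = np.zeros(DIM)
    for emb, label in words:
        out += ROW[label] * emb  # elementwise spreading (chipping)
    return out

rng = np.random.default_rng(0)
sentence = [(rng.standard_normal(DIM), "nsubj"),
            (rng.standard_normal(DIM), "root"),
            (rng.standard_normal(DIM), "dobj")]
vec = encode(sentence)
print(vec.shape)  # (8,) regardless of sentence length
```

Because distinct Hadamard rows are orthogonal, words tagged with different labels occupy near-independent "channels" of the sum, which is what lets the encoding preserve word roles while producing a fixed-length vector suitable for a fully connected network.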
Date of Conference: 25-28 May 2022
Date Added to IEEE Xplore: 25 October 2022
Conference Location: Timisoara, Romania

