
A Graph-Based Recommendation Algorithm on Quaternion Algebra

  • Original Research

Abstract

This study presents a novel quaternion-based link prediction method for use in recommendation systems. The method performs its computations in quaternion algebra, exploiting the expressive, wide-ranging learning properties of the Hamilton product. Its key capability is link prediction, which is used to boost performance in top-N recommendation tasks. Experimental results show that the proposed method yields substantially improved performance on three quality measures, (i) hit rate, (ii) coverage, and (iii) novelty, when applied to two datasets, namely Movielens and Hetrec. To assess how flexibly the proposed algorithm incorporates alternative sources of information, further large-scale tests are carried out on three subsets of the Amazon dataset, verifying the effectiveness of quaternion algebra in graph-based recommendation algorithms. The proposed algorithms are further enhanced with similarity and dissimilarity factors, as well as ‘like’ and ‘dislike’ relationships, between users and items. The approach adapts readily to different information sources and successfully overcomes the drawbacks of conventional graph-based recommender systems. It is argued that the proposed quaternion-based link prediction method stands as a superior alternative to existing methods.
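For readers unfamiliar with the Hamilton product mentioned above, the following minimal Python sketch illustrates the operation that underlies quaternion-based embeddings and link prediction. It is not the authors' implementation; the toy scoring function and embedding values are illustrative assumptions only.

```python
# Illustrative sketch only (not the paper's code): the Hamilton product of two
# quaternions q = a + b*i + c*j + d*k, the core operation in quaternion-based
# embedding and link-prediction models.
import numpy as np

def hamilton_product(q1, q2):
    """Hamilton product of quaternions given as (a, b, c, d) arrays."""
    a1, b1, c1, d1 = q1
    a2, b2, c2, d2 = q2
    return np.array([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,   # real part
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,   # i component
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,   # j component
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,   # k component
    ])

def toy_link_score(user_q, item_q):
    """Hypothetical score: real part of user ⊗ conj(item); illustration only."""
    item_conj = np.array([item_q[0], -item_q[1], -item_q[2], -item_q[3]])
    return hamilton_product(user_q, item_conj)[0]

if __name__ == "__main__":
    u = np.array([0.9, 0.1, -0.2, 0.3])   # toy user embedding (one quaternion)
    v = np.array([0.8, 0.0, -0.1, 0.4])   # toy item embedding
    print(toy_link_score(u, v))           # larger value = stronger predicted link
```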

References

  1. Yuan X, Huang JJ. An adaptive method for the tag-rating-based recommender system. In: AMT'12, International Conference on Active Media Technology. Berlin: Springer; 2012. p. 206–14.

  2. Zhang Y, Ai Q, Chen X, Croft WB. Joint representation learning for top-n recommendation with heterogeneous information sources. In: CIKM’17, 26th Conference on Information and Knowledge Management, ACM, 2017, pp. 1449–58.

  3. Kurt Z, Gerek ON, Bilge A, Özkan K. A multi source graph-based hybrid recommendation algorithm. In: Lecture notes on data engineering and communications technologies (trends in data engineering methods for intelligent systems). Berlin: Springer; 2020. p. 280–91.

  4. Zhang S, Yao L, Tran LV, Zhang A, Tay Y. Quaternion collaborative filtering for recommendation. 2019. arXiv preprint, arXiv:1906.02594.

  5. Hayashi K, Shimbo M. On the equivalence of holographic and complex embeddings for link prediction. 2017. arXiv preprint, arXiv:1702.05563.

  6. Danihelka I, Wayne G, Uria B, Kalchbrenner N, Graves A. Associative long short-term memory. 2016. arXiv preprint, arXiv:1602.03032.

  7. Trouillon T, Welbl J, Riedel S, Gaussier É, Bouchard G. Complex embeddings for simple link prediction. In: ICML’16, International Conference on Machine Learning, 2016.

  8. Tay Y, Luu AT, Hui SC. Hermitian co-attention networks for text matching in asymmetrical domains. In: IJCAI’18, 27th International Joint Conference on Artificial Intelligence, 2018, pp. 4425–31.

  9. Trabelsi C, Bilaniuk O, Zhang Y, Serdyuk D, Subramanian S, Santos JF, Pal CJ. Deep complex networks. 2017. arXiv preprint, arXiv:1705.09792.

  10. Witten B, Shragge J. Quaternion-based signal processing. In: SEG technical program expanded abstracts, Society of Exploration Geophysicists, 2006, pp. 2862–66.

  11. Parcollet T, Morchid M, Linarès G. Quaternion convolutional neural networks for heterogeneous image processing. In: ICASSP’19, International Conference on Acoustics, Speech and Signal Processing, IEEE, 2019, pp. 8514–18.

  12. Gaudet CJ, Maida AS. Deep quaternion networks. In: IJCNN’18, International Joint Conference on Neural Networks, IEEE, 2018, pp. 1–8.

  13. Greenblatt AB, Agaian SS. Introducing quaternion multi-valued neural networks with numerical examples. Inf Sci. 2018;423:326–42.

  14. Saoud LS, Ghorbani R, Rahmoune F. Cognitive quaternion valued neural network and some applications. Neurocomputing. 2017;221:85–93.

  15. Du Y, Xu C, Tao D. Privileged matrix factorization for collaborative filtering. In: IJCAI’17, 26th International Joint Conference on Artificial Intelligence, Melbourne, Australia August 19–25, 2017, pp. 1610–16.

  16. Wang Z, Tan Y, Zhang M. Graph-based recommendation on social networks. In: 12th International Asia-Pacific Web Conference, IEEE, 2010, pp. 116–122.

  17. Adomavicius G, Tuzhilin A. Toward the next generation of recommender systems: a survey of the state-of-the-art and possible extensions. IEEE Trans Knowl Data Eng. 2005;17(6):734–49.

  18. Herlocker JL, Konstan JA, Terveen LG, Riedl JT. Evaluating collaborative filtering recommender systems. ACM Trans Inf Syst. 2004;22(1):5–53.

  19. McNee SM, Riedl J, Konstan JA. Being accurate is not enough: how accuracy metrics have hurt recommender systems. In: CHI'06, Conference on Human Factors in Computing Systems, Montreal, Canada, 2006, pp. 1097–101.

  20. Cacheda F, Carneiro V, Fernández D, Formoso V. Comparison of collaborative filtering algorithms: Limitations of current techniques and proposals for scalable, high-performance recommender systems. ACM Trans Web. 2011;5(1):1–33.

  21. Kurt Z, Ozkan K, Bilge A, Gerek ON. A similarity-inclusive link prediction based recommender system approach. Elektronika ir Elektrotechnika. 2019;25(6):62–9.

  22. Xie F, Chen Z, Shang J, Feng X, Li J. A link prediction approach for item recommendation with complex number. Knowl-Based Syst. 2015;81:148–58.

  23. Mishchenko A, Solovyov Y. Quaternions. Quantum. 2000;11:4–7.

  24. Kunegis J, Gröner G, Gottron T. Online dating recommender systems: the split-complex number approach. In: RSWeb’12, 4th ACM Recsys Workshop on Recommender Systems and the Social Web, ACM 2012, pp. 37–44.

  25. Harary F. On the notion of balance of a signed graph. Mich Math J. 1955;2:143–6.

  26. Harary F, Palmer EM. On the number of balanced signed graphs. Bull Math Biophys. 1967;29(4):759–65.

  27. Kurt Z, Gerek ÖN, Bilge A, Özkan K. Similarity-inclusive link prediction with Quaternions. Int Conf Enterp Inf Syst. 2021;1:842–54.

  28. Bedi P, Gautam A, Bansal S, Bhatia D. Weighted bipartite graph model for recommender system using entropy based similarity measure. In: ISTA’17, 2nd international symposium on intelligent systems technologies and applications. Cham: Springer; 2017. p. 163–73.

  29. http://jmcauley.ucsd.edu/data/amazon/. Accessed 18 May 2022.

  30. Kaminskas M, Bridge D. Diversity, serendipity, novelty, and coverage: a survey and empirical analysis of beyond-accuracy objectives in recommender systems. ACM Trans Interact Intell Syst. 2017;7(1):2.

  31. http://grouplens.org/datasets/movielens/100k/. Accessed 18 May 2022.

  32. https://grouplens.org/datasets/hetrec-2011/. Accessed 18 May 2022.

  33. Huang Z, Chung W, Ong TH, Chen H. A graph-based recommender system for digital library. In: JCDL’02, 2nd ACM/IEEE-CS joint Conference on Digital libraries, ACM, Oregon, USA, 2002, pp. 65–73.

  34. Quaternion toolbox for Matlab, http://qtfm.sourceforge.net/. Accessed 18 May 2022.

  35. Saff EB, Snider AD. Fundamentals of matrix analysis with applications. New Jersey: John Wiley & Sons; 2015.

Funding

This study was not funded by any company.

Author information

Corresponding author

Correspondence to Zuhal Kurt.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the topical collection “Enterprise Information Systems” guest edited by Michal Smialek, Slimane Hammoudi, Alexander Brodsky and Joaquim Filipe.

Appendix

Theorem 1.

Let \({\varvec{A}}\) be an \(n \times n\) matrix, and let \(\lambda = (\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n})\) be the eigenvalues of \({\varvec{A}}\). For every positive integer \(k\), the \(k\)th powers \(\lambda^{k} = (\lambda_{1}^{k}, \lambda_{2}^{k}, \ldots, \lambda_{n}^{k})\) are the eigenvalues of \({\varvec{A}}^{k}\) [35].

Proof 1.

By triangularization (or the Jordan canonical form), there exists a nonsingular matrix \({\varvec{S}}\) such that

$${\varvec{S}}^{-1}{\varvec{A}}{\varvec{S}} = \begin{bmatrix} \lambda_{1} & * & \cdots & * & * \\ 0 & \lambda_{2} & \cdots & * & * \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & \lambda_{n-1} & * \\ 0 & 0 & \cdots & 0 & \lambda_{n} \end{bmatrix}.$$

The result is an upper triangular matrix whose diagonal entries are the eigenvalues of \({\varvec{A}}\). Taking the \(k\)th power of this triangular form gives

$${\varvec{S}}^{-1}{\varvec{A}}^{k}{\varvec{S}} = ({\varvec{S}}^{-1}{\varvec{A}}{\varvec{S}})^{k} = \begin{bmatrix} \lambda_{1}^{k} & * & \cdots & * & * \\ 0 & \lambda_{2}^{k} & \cdots & * & * \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & \lambda_{n-1}^{k} & * \\ 0 & 0 & \cdots & 0 & \lambda_{n}^{k} \end{bmatrix}.$$

The characteristic polynomial of \({\varvec{A}}^{k}\) is

$$\begin{aligned} f(\lambda) &= \det({\varvec{A}}^{k} - \lambda {\varvec{I}}) \\ &= \det({\varvec{S}}^{-1})\det({\varvec{A}}^{k} - \lambda {\varvec{I}})\det({\varvec{S}}) \\ &= \det\bigl({\varvec{S}}^{-1}({\varvec{A}}^{k} - \lambda {\varvec{I}}){\varvec{S}}\bigr) \\ &= \det({\varvec{S}}^{-1}{\varvec{A}}^{k}{\varvec{S}} - \lambda {\varvec{I}}) \\ &= \det\begin{bmatrix} \lambda_{1}^{k} - \lambda & * & \cdots & * & * \\ 0 & \lambda_{2}^{k} - \lambda & \cdots & * & * \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & \lambda_{n-1}^{k} - \lambda & * \\ 0 & 0 & \cdots & 0 & \lambda_{n}^{k} - \lambda \end{bmatrix} \\ &= \prod_{i=1}^{n} (\lambda_{i}^{k} - \lambda). \end{aligned}$$

(A1)

Since the roots of the characteristic polynomial are exactly the eigenvalues, as stated in Proposition 1, it follows from Eq. (A1) that \(\lambda^{k} = (\lambda_{1}^{k}, \lambda_{2}^{k}, \ldots, \lambda_{n}^{k})\) are the eigenvalues of \({\varvec{A}}^{k}\).

Hence, when \({\varvec{A}}\) is diagonalizable with an orthonormal eigenvector matrix \({\varvec{U}}\) (as is the case for a symmetric \({\varvec{A}}\)) and eigenvalue matrix \({{\varvec{\Lambda}}}\), the \(k\)th power of \({\varvec{A}}\) can be written as

$${\varvec{A}}^{k} = {\varvec{U}}{{\varvec{\Lambda}}}^{k}{\varvec{U}}^{T}.$$
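As a quick illustration of Theorem 1 (not part of the original appendix), the following Python sketch numerically verifies, for a randomly generated symmetric matrix, that the eigenvalues of \({\varvec{A}}^{k}\) are the \(k\)th powers of the eigenvalues of \({\varvec{A}}\) and that \({\varvec{A}}^{k} = {\varvec{U}}{{\varvec{\Lambda}}}^{k}{\varvec{U}}^{T}\).

```python
# Numerical sanity check of Theorem 1 (a sketch, not part of the paper):
# for a symmetric A, the eigenvalues of A^k are the k-th powers of the
# eigenvalues of A, and A^k = U Λ^k U^T with orthonormal U.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                        # random symmetric test matrix
k = 3

lam, U = np.linalg.eigh(A)               # A = U diag(lam) U^T
A_k_direct = np.linalg.matrix_power(A, k)
A_k_spectral = U @ np.diag(lam ** k) @ U.T

print(np.allclose(A_k_direct, A_k_spectral))            # True
print(np.allclose(np.linalg.eigvalsh(A_k_direct),
                  np.sort(lam ** k)))                    # True
```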

Theorem 2.

If \({\varvec{A}}\) is an \(n \times n\) matrix with eigenvalues \(\lambda_{i}\), \(i = 1, 2, \ldots, n\), then

(a) \(\det({\varvec{A}}) = \prod\limits_{i = 1}^{n} \lambda_{i}\),

(b) \({\text{trace}}({\varvec{A}}) = \sum\limits_{i = 1}^{n} \lambda_{i}\).

Proof 2. (Part a)

Recall that eigenvalues are roots of the characteristic polynomial.

This polynomial is \(f_{{\varvec{A}}}(\lambda) = \det({\varvec{A}} - \lambda {\varvec{I}}_{n})\), so that

$$\det({\varvec{A}} - \lambda {\varvec{I}}_{n}) = \begin{vmatrix} a_{11} - \lambda & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} - \lambda & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} - \lambda \end{vmatrix} = \prod_{i = 1}^{n} (\lambda_{i} - \lambda).$$
(A2)

Setting \(\lambda = 0\) in Eq. (A2) gives

$$\det({\varvec{A}}) = \prod\limits_{i = 1}^{n} \lambda_{i},$$

and this completes the proof of part (a).

(Part b) Let us now compare the coefficients of \(\lambda^{n - 1}\) on both sides of Eq. (A2). On the left side, the coefficient of \(\lambda^{n - 1}\) in the determinant is \(( - 1)^{n - 1} (a_{11} + a_{22} + \cdots + a_{nn}) = ( - 1)^{n - 1} {\text{trace}}({\varvec{A}})\).

On the right side of Eq. (A2), the coefficient of \(\lambda^{n - 1}\) is \(( - 1)^{n - 1} \sum\nolimits_{i = 1}^{n} \lambda_{i}\). Equating the two coefficients completes the proof of part (b): \({\text{trace}}({\varvec{A}}) = \sum\nolimits_{i = 1}^{n} \lambda_{i}\).
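As an illustrative check of Theorem 2 (not part of the original appendix), the short Python sketch below verifies numerically that the determinant and trace of a random matrix equal the product and the sum of its eigenvalues, respectively.

```python
# Numerical sanity check of Theorem 2 (illustrative only): det(A) equals
# the product of the eigenvalues and trace(A) equals their sum.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
lam = np.linalg.eigvals(A)               # eigenvalues (complex conjugate pairs)

# For a real matrix the imaginary parts of the product/sum cancel out.
print(np.isclose(np.linalg.det(A), np.prod(lam).real))   # True
print(np.isclose(np.trace(A), np.sum(lam).real))         # True
```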

About this article

Cite this article

Kurt, Z., Gerek, Ö.N., Bilge, A. et al. A Graph-Based Recommendation Algorithm on Quaternion Algebra. SN COMPUT. SCI. 3, 299 (2022). https://doi.org/10.1007/s42979-022-01171-4

  • DOI: https://doi.org/10.1007/s42979-022-01171-4
