Neural Architectures for Searching Subgraph Structures

Published: 18 July 2023

ABSTRACT

With the development of new neural network architectures for graph learning in recent years, the use of graphs to store, represent, and process data has become increasingly common. Graphs are now used as a rich, structured form of data representation in many real-world projects, where objects are often defined in terms of their connections to other objects. Practical applications exist in areas such as antibacterial discovery, physics simulation, fake news detection, traffic prediction, and recommendation systems.

While there is a growing number of neural graph representation learning techniques that allow one to learn effective graph representations [1-3, 5], they are not necessarily appropriate for the task of searching over graphs, for several reasons: (1) nodes within a graph carry a set of attributes, which are the subject of the search process; however, the attributes on any individual node are extremely sparse relative to the number of unique attributes across the graph, which makes it very difficult to learn effective graph representations from such sparse information; (2) graph neural networks can generate rich embedding representations for nodes and entities in a graph, but depending on the downstream task, these embedding vectors may not perform as well as expected. Researchers therefore apply solutions such as custom loss functions or pre-training tasks to adapt these architectures to their specific task. Search over graphs as a downstream task follows the same trend, and a tailored graph neural network representation is needed to specifically provide rich embeddings for the sole purpose of subgraph search.
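
As an illustration of point (2), the sketch below pairs a generic GNN encoder with a task-specific ranking loss for subgraph search. This is only a minimal, hypothetical example in PyTorch: the two-round mean-aggregation encoder, the triplet loss, and all tensor shapes and names are assumptions made for illustration, not the architecture proposed in this work.

# Minimal, hypothetical sketch: a generic GNN encoder adapted to subgraph
# search with a task-specific (triplet) loss. All names and shapes are
# illustrative assumptions, not this paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MeanAggGNN(nn.Module):
    """Two rounds of mean-neighbour aggregation over a dense adjacency matrix."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hidden_dim)
        self.lin2 = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: (N, N) adjacency with self-loops; x: (N, in_dim) sparse node attributes.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = F.relu(self.lin1(adj @ x / deg))   # first aggregation round
        h = F.relu(self.lin2(adj @ h / deg))   # second aggregation round
        return h                               # (N, hidden_dim) node embeddings


def subgraph_embedding(node_emb: torch.Tensor, node_ids: list[int]) -> torch.Tensor:
    # Pool the node embeddings of a candidate subgraph into a single vector.
    return node_emb[node_ids].mean(dim=0)


# Task-specific loss: pull the query embedding towards a matching subgraph
# and push it away from a non-matching one (triplet margin loss).
triplet = nn.TripletMarginLoss(margin=1.0)

# Toy example with random data (purely illustrative).
N, D, H = 50, 300, 64                                    # nodes, attribute dim, hidden dim
x = (torch.rand(N, D) < 0.02).float()                    # very sparse binary node attributes
adj = torch.eye(N) + (torch.rand(N, N) < 0.05).float()   # random graph with self-loops
model = MeanAggGNN(D, H)

node_emb = model(x, adj)
query = node_emb[[0, 1, 2]].mean(dim=0)          # embedding of a query structure
pos = subgraph_embedding(node_emb, [3, 4, 5])    # matching candidate subgraph
neg = subgraph_embedding(node_emb, [10, 11, 12]) # non-matching candidate subgraph
loss = triplet(query.unsqueeze(0), pos.unsqueeze(0), neg.unsqueeze(0))
loss.backward()                                  # gradients adapt the encoder to the search task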

My work focuses on searching for subgraph structures over both complete and incomplete heterogeneous graphs. The significance of my research direction lies in the fact that exact subgraph search is NP-hard, and as such, existing methods are either accurate but impractically slow, or efficient yet suffering from low effectiveness. With a focus on learning robust neural representations for complete and incomplete graphs, my research develops search methods that are both effective and efficient. Specifically, my research addresses the following research questions: RQ1) Is it possible to design graph representation learning methods that generate effective embedding vectors from a heterogeneous graph and support effective and efficient subgraph search? RQ2) Is it possible to address graphs with varying degrees of missing values? Incomplete graphs suffer from missing attributes and/or missing edges, and I explore the design of robust graph representation learning models capable of searching effectively in light of missing information. RQ3) Can efficient and effective derivations of my subgraph search methods be used to address practical applications in real-world domains such as team formation and keyword search over knowledge graphs?
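
To make the retrieval setting behind RQ1 and RQ3 concrete, the sketch below shows the generic idea of embedding-based subgraph search: candidate subgraph embeddings are precomputed offline, and a query embedding is matched against them by cosine similarity at query time. The dimensionality, the scoring function, and all function names here are assumptions for this example only, not a description of my proposed method.

# Hypothetical sketch of embedding-based subgraph retrieval: candidate
# subgraph embeddings are precomputed offline, and a query embedding is
# matched against them with cosine similarity at query time.
import numpy as np


def cosine_scores(query: np.ndarray, candidates: np.ndarray) -> np.ndarray:
    # query: (d,), candidates: (m, d); returns (m,) cosine similarities.
    q = query / (np.linalg.norm(query) + 1e-12)
    c = candidates / (np.linalg.norm(candidates, axis=1, keepdims=True) + 1e-12)
    return c @ q


def retrieve_top_k(query: np.ndarray, candidates: np.ndarray, k: int = 5) -> list[int]:
    # Rank all candidate subgraphs and return the indices of the top-k matches.
    scores = cosine_scores(query, candidates)
    return np.argsort(-scores)[:k].tolist()


# Toy usage: 1,000 candidate subgraphs embedded in a 64-dimensional space
# (in practice these vectors would come from a learned graph encoder).
rng = np.random.default_rng(0)
candidate_embeddings = rng.normal(size=(1000, 64))
query_embedding = rng.normal(size=64)
print(retrieve_top_k(query_embedding, candidate_embeddings, k=3))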

Research conducted as part of my Ph.D. has so far focused on identifying and retrieving subgraphs over complete heterogeneous graphs. I have used the Team Formation problem as a case study to evaluate my work [4]. As the next step, I plan to expand my research to incomplete graphs and to investigate methodologies for crafting graph representations optimized for the specific task of searching over graphs.
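
For context, the toy example below frames the Team Formation case study as a coverage problem: given a set of required skills and an expert-to-skill mapping, select a small team whose combined skills cover the requirement. The greedy set-cover heuristic and the data here are purely illustrative baselines, not the neural approach of [4].

# Illustrative toy formulation of team formation: cover a set of required
# skills with as few experts as possible (greedy set cover). This is only a
# baseline framing of the problem, not the method studied in [4].

def form_team(required: set[str], expert_skills: dict[str, set[str]]) -> list[str]:
    team, uncovered = [], set(required)
    while uncovered:
        # Pick the expert covering the most still-uncovered skills.
        best = max(expert_skills, key=lambda e: len(expert_skills[e] & uncovered))
        gained = expert_skills[best] & uncovered
        if not gained:            # remaining skills cannot be covered by anyone
            break
        team.append(best)
        uncovered -= gained
    return team


experts = {
    "alice": {"python", "ml"},
    "bob": {"databases", "ml"},
    "carol": {"python", "visualization"},
}
print(form_team({"python", "ml", "databases"}, experts))  # e.g. ['alice', 'bob']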

References

  1. Aditya Grover and Jure Leskovec. 2016. node2vec: Scalable Feature Learning for Networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13-17, 2016. ACM, 855--864. https://doi.org/10.1145/2939672.2939754
  2. William L. Hamilton, Zhitao Ying, and Jure Leskovec. 2017. Inductive Representation Learning on Large Graphs. In Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, December 4-9, 2017, Long Beach, CA, USA. 1024--1034. https://proceedings.neurips.cc/paper/2017/hash/5dd9db5e033da9c6fb5ba83c7a7ebea9-Abstract.html
  3. Thomas N. Kipf and Max Welling. 2017. Semi-Supervised Classification with Graph Convolutional Networks. In 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, April 24-26, 2017, Conference Track Proceedings. OpenReview.net. https://openreview.net/forum?id=SJU4ayYgl
  4. Radin Hamidi Rad, Ebrahim Bagheri, Mehdi Kargar, Divesh Srivastava, and Jaroslaw Szlichta. 2021. Retrieving Skill-Based Teams from Collaboration Networks. In SIGIR '21: The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual Event, Canada, July 11-15, 2021. ACM, 2015--2019. https://doi.org/10.1145/3404835.3463105
  5. Petar Velickovic, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, and Yoshua Bengio. 2018. Graph Attention Networks. In 6th International Conference on Learning Representations, ICLR 2018, Vancouver, BC, Canada, April 30 - May 3, 2018, Conference Track Proceedings. OpenReview.net. https://openreview.net/forum?id=rJXMpikCZ

Published in: SIGIR '23: Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, July 2023, 3567 pages. Association for Computing Machinery, New York, NY, United States. ISBN: 9781450394086. DOI: 10.1145/3539618
