Abstract
A multi-graph is modeled as a bag of graphs, and the mutual relationships among those graphs can be exploited to improve the accuracy of multi-graph classification. To the best of our knowledge, however, no existing work utilizes these mutual relationships. In this paper, we propose a novel super-graph model \(SG=(MG,AG)\), where MG denotes a multi-graph and AG is a graph (called the abstract-graph) that describes the mutual relationships among the graphs contained in MG. The super-graph classification problem is challenging because of the complex structure of the super-graph model; in particular, it is hard to directly select discriminative subgraphs, i.e., subgraph features, from super-graphs, where a subgraph g of a graph G is a graph isomorphic to one of the substructures of G, and subgraph features are subgraphs that can distinguish super-graphs with different class labels. Moreover, practical applications require super-graph classification to achieve high precision. We therefore introduce the concept of composite subgraph features, together with an algorithm for selecting them, and build on them a framework (SGC) for solving the super-graph classification problem. A two-step approach first selects k composite subgraph features from the subgraphs of the super-graphs' abstract-graphs and multi-graphs. Based on these composite features and the subgraph feature representation, each super-graph SG is then transformed into a k-dimensional 0-1 vector: if SG contains a substructure isomorphic to its i-th composite feature, the i-th component of the vector is set to 1 (1 ≤ i ≤ k); otherwise, it is set to 0. On the resulting k-dimensional vectors, an existing classification algorithm, such as naive Bayes or support vector machine (SVM), is used to build a prediction model for the class labels of unseen super-graphs. In particular, we adapt the extreme learning machine (ELM) algorithm to further improve the accuracy of super-graph classification. Extensive experiments on real-world image datasets show that our ELM-based algorithm is more accurate than the baseline algorithms.
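The following is a minimal sketch, not the authors' SGC implementation, of the two computational steps described above: mapping a super-graph to a k-dimensional 0-1 vector over composite subgraph features, and training a basic single-hidden-layer ELM on those vectors. The `contains` predicate is a hypothetical stand-in for a subgraph-isomorphism test, the composite features are assumed to come from the paper's two-step selection stage, and the toy data in the demo is purely illustrative.

```python
# Sketch only: binary feature vectors over composite subgraph features + basic ELM.
import numpy as np


def to_feature_vector(super_graph, composite_features, contains):
    """Map a super-graph to a 0-1 vector: component i is 1 iff the super-graph
    contains a substructure isomorphic to the i-th composite feature.
    `contains(super_graph, feature)` is a hypothetical subgraph-isomorphism test."""
    return np.array([1.0 if contains(super_graph, f) else 0.0
                     for f in composite_features])


def train_elm(X, T, n_hidden=64, seed=0):
    """Basic ELM: random input weights and biases, hidden outputs via tanh,
    output weights solved in closed form with the Moore-Penrose pseudo-inverse."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights (least squares)
    return W, b, beta


def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta


if __name__ == "__main__":
    # Toy demo: 100 super-graphs already encoded as 0-1 vectors (k = 20), 2 classes.
    rng = np.random.default_rng(1)
    X = rng.integers(0, 2, size=(100, 20)).astype(float)
    T = np.eye(2)[rng.integers(0, 2, size=100)]      # one-hot class labels
    W, b, beta = train_elm(X, T)
    pred = elm_predict(X, W, b, beta).argmax(axis=1)
    print("training accuracy:", (pred == T.argmax(axis=1)).mean())
```

In practice, any subgraph-isomorphism routine could back the `contains` predicate, and the vectors produced by `to_feature_vector` could equally be fed to naive Bayes or SVM, as the abstract notes.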
Funding
This work is partially supported by the National Natural Science Foundation of China (No. 61702381, No. 61772124, No. 61872070), the Hubei Natural Science Foundation (No. 2017CFB196), the Scientific Research Foundation of Wuhan University of Science and Technology (2017xz015), and the Fundamental Research Funds for the Central Universities (150402002, 171605001). Jia Xu is supported by the Key Projects of Higher Education Undergraduate Teaching Reform in Guangxi (No. 2017JGZ103) and the Scientific Research Foundation of Guangxi University (No. XGZ141182).
Ethics declarations
Conflict of interests
The authors declare that they have no conflict of interest.
Ethical Approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
About this article
Cite this article
Pang, J., Zhao, Y., Xu, J. et al. Super-Graph Classification Based on Composite Subgraph Features and Extreme Learning Machine. Cogn Comput 10, 922–936 (2018). https://doi.org/10.1007/s12559-018-9601-x