
The Improved Ordering-Based Search Method Incorporating with Ensemble Learning

Published in: Cognitive Computation

Abstract

Bayesian networks provide a useful way to deal with uncertain information, helping researchers to better understand human cognitive processes. Building a Bayesian network first requires identifying the qualitative relations between variables, a task known as structure learning. Local search in the ordering space is an effective method for learning the structure of large-scale Bayesian networks. However, existing algorithms tend to become trapped in local optima and stop searching for better solutions. To escape a local optimum, they apply random perturbations without any guiding strategy, producing many fruitless restarts that consume considerable time yet fail to improve the results. As an extension of local search, simulated annealing searches the solution space stochastically and accepts relatively poor solutions with a certain probability. This paper proposes a simulated-annealing-based method for learning Bayesian network structure in the ordering space, which widens the search scope by probabilistically accepting worse solutions. Moreover, we improve simulated annealing by adding a memory module and modifying the termination condition. The memory module records the best solution found before a worse solution is accepted, so a potential global optimum is never lost. The new termination condition depends on the quality of the search results, which eliminates many redundant searches. In addition, we design a new restart strategy based on ensemble learning: when the search is trapped in a local optimum, a new ordering is obtained by perturbing the current ordering under constraints, and the search restarts from it. The constraints are generated by ensemble learning over multiple learned structures, which helps the algorithm approach the globally optimal solution.
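The annealing loop with a memory module and a quality-based stopping rule can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: `score` stands in for a decomposable Bayesian-network scoring function (e.g., BIC, higher is better), and the adjacent-swap neighbourhood, geometric cooling rate, and patience-based termination are assumptions.

```python
import math
import random

def anneal_ordering(variables, score, t0=1.0, cooling=0.95, patience=100):
    """Search the ordering space by simulated annealing.

    A memory module keeps the best ordering seen so far, so accepting a
    worse ordering never loses a potential global optimum, and the loop
    terminates once no improvement is found for `patience` consecutive
    steps (a quality-based stopping rule rather than a fixed budget).
    """
    current = list(variables)
    random.shuffle(current)
    cur_s = score(current)
    best, best_s = current[:], cur_s          # memory module
    t, stall = t0, 0
    while stall < patience:
        i = random.randrange(len(current) - 1)
        cand = current[:]
        cand[i], cand[i + 1] = cand[i + 1], cand[i]   # swap an adjacent pair
        cand_s = score(cand)
        # always accept improvements; accept worse orderings with
        # probability exp(delta / t), which widens the search scope
        if cand_s >= cur_s or random.random() < math.exp((cand_s - cur_s) / t):
            current, cur_s = cand, cand_s
        if cur_s > best_s:                    # record before drifting away
            best, best_s = current[:], cur_s
            stall = 0
        else:
            stall += 1
        t = max(t * cooling, 1e-6)            # geometric cooling, bounded below
    return best, best_s
```

A toy score, such as minus the number of inversions relative to a reference ordering, is enough to exercise the loop; in the paper's setting the score would come from evaluating the best network consistent with each ordering.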
Experimental results on benchmark networks show that the proposed method learns the optimal structure more accurately and efficiently than state-of-the-art algorithms.
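The constrained restart described above might look like the following sketch. It is illustrative only: the majority-vote threshold, the adjacent-swap perturbation, and the function names are assumptions; the paper derives its constraints from ensemble learning over multiple learned structures, not necessarily from a simple vote over orderings.

```python
import random
from itertools import combinations

def consensus_constraints(orderings, threshold=0.8):
    """Derive precedence constraints by voting across an ensemble of
    orderings: keep 'a before b' only when at least `threshold` of the
    ensemble members agree."""
    n = len(orderings)
    votes = {}
    for o in orderings:
        # combinations yields pairs in their order within o, i.e. a precedes b
        for a, b in combinations(o, 2):
            votes[(a, b)] = votes.get((a, b), 0) + 1
    return {pair for pair, c in votes.items() if c / n >= threshold}

def constrained_restart(ordering, constraints, swaps=10):
    """Perturb an ordering by random adjacent swaps, rejecting any swap
    that would reverse a consensus 'a before b' constraint."""
    new = list(ordering)
    for _ in range(swaps):
        i = random.randrange(len(new) - 1)
        if (new[i], new[i + 1]) not in constraints:  # swap keeps all kept pairs
            new[i], new[i + 1] = new[i + 1], new[i]
    return new
```

Because each adjacent swap reverses only the one pair it touches, any ordering that satisfies the constraints still satisfies them after the perturbation, so the restart stays near the ensemble consensus while still exploring new regions.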


Figures 1-12 and Algorithms 1-3 appear in the full article.


Data Availability

The data used in this manuscript are generated from public networks available at https://www.bnlearn.com/bnrepository/.

Notes

  1. http://www.bnlearn.com/bnrepository/.

  2. https://pgmpy.org/.

  3. https://github.com/cmu-phil/tetrad.

  4. https://github.com/fishmoon1234/DAG-GNN.


Author information

Authors and Affiliations

Authors

Contributions

HW: conceptualization, writing—original draft preparation. ZW: methodology, supervision. RZ: validation, software. XL: software. XG: writing—reviewing and editing.

Corresponding author

Correspondence to Zidong Wang.

Ethics declarations

Conflict of Interest

The authors declare no competing interests.


About this article


Cite this article

Wang, H., Wang, Z., Zhong, R. et al. The Improved Ordering-Based Search Method Incorporating with Ensemble Learning. Cogn Comput 16, 852–876 (2024). https://doi.org/10.1007/s12559-024-10251-x
