
  • Review Article

Ensemble deep learning in bioinformatics

Abstract

The remarkable flexibility and adaptability of ensemble methods and deep learning models have led to the proliferation of their application in bioinformatics research. Traditionally, these two machine learning techniques have largely been treated as independent methodologies in bioinformatics applications. However, the recent emergence of ensemble deep learning—wherein the two techniques are combined to achieve synergistic improvements in model accuracy, stability and reproducibility—has prompted a new wave of research and application. Here, we share recent key developments in ensemble deep learning and examine how they have benefited a wide range of bioinformatics research, from basic sequence analysis to systems biology. While the application of ensemble deep learning in bioinformatics is diverse and multifaceted, we identify and discuss the challenges and opportunities common to these applications. We hope this Review Article will bring together the broader community of machine learning researchers, bioinformaticians and biologists to foster future research and development in ensemble deep learning, and inspire novel bioinformatics applications that are unattainable by traditional methods.
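To make the core idea concrete, the sketch below shows a minimal supervised ensemble deep learning setup: several independently initialized neural networks are trained on bootstrap resamples of the data (bagging) and their class probabilities are averaged (soft voting). This is an illustration only; the toy data, the scikit-learn MLPClassifier base learners and all hyperparameters are our assumptions, not methods from the Review Article.

```python
# Minimal sketch: bagging an ensemble of small neural networks and
# combining them by averaging predicted class probabilities.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical toy data standing in for a bioinformatics feature matrix.
X, y = make_classification(n_samples=1000, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
members = []
for seed in range(5):  # five base networks form the ensemble
    idx = rng.integers(0, len(X_train), size=len(X_train))  # bootstrap resample
    net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                        random_state=seed)
    members.append(net.fit(X_train[idx], y_train[idx]))

# Soft voting: average the members' predicted class probabilities.
proba = np.mean([m.predict_proba(X_test) for m in members], axis=0)
print(f"ensemble accuracy: {np.mean(proba.argmax(axis=1) == y_test):.3f}")
```

Averaging over independently trained members typically reduces the variance contributed by any single random initialization, which is the stability benefit the abstract refers to.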


Fig. 1: The focus of this Review Article and classic ensemble methods.
Fig. 2: Typical ensemble deep learning frameworks in supervised and unsupervised learning.
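Fig. 2 distinguishes supervised from unsupervised ensemble deep learning frameworks. As a hedged sketch of the unsupervised branch, in the spirit of autoencoder-based cluster ensembles, the toy example below trains several randomly initialized one-hidden-layer autoencoders, clusters each latent embedding, and merges the base partitions through a co-association (consensus) matrix. The data, model sizes and the consensus step are assumptions for illustration, not the specific frameworks reviewed in the article.

```python
# Minimal sketch: an unsupervised cluster ensemble built from several
# randomly initialized autoencoders, combined via a co-association matrix.
import numpy as np
from sklearn.cluster import KMeans, SpectralClustering
from sklearn.datasets import make_blobs
from sklearn.neural_network import MLPRegressor

# Hypothetical toy data standing in for, e.g., a single-cell expression matrix.
X, _ = make_blobs(n_samples=300, n_features=30, centers=3, random_state=0)

n_members, k = 5, 3
co_assoc = np.zeros((len(X), len(X)))
for seed in range(n_members):
    # One-hidden-layer autoencoder (30 -> 8 -> 30); the hidden-layer
    # activations serve as the learned embedding for this ensemble member.
    ae = MLPRegressor(hidden_layer_sizes=(8,), max_iter=1000, random_state=seed)
    ae.fit(X, X)
    z = np.maximum(X @ ae.coefs_[0] + ae.intercepts_[0], 0)  # ReLU encoding
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(z)
    co_assoc += labels[:, None] == labels[None, :]  # co-clustered pairs

# Consensus clustering on the averaged co-association (similarity) matrix.
consensus = SpectralClustering(n_clusters=k, affinity="precomputed",
                               random_state=0).fit_predict(co_assoc / n_members)
print(np.bincount(consensus))
```

The co-association matrix records how often each pair of samples is clustered together across members, so the consensus partition is insensitive to any single autoencoder's random initialization.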



Acknowledgements

P.Y. was supported by an Australian Research Council (ARC) Discovery Early Career Researcher Award (DE170100759) and a National Health and Medical Research Council Investigator Grant (1173469). J.Y.H.Y. and P.Y. were supported by an ARC Discovery Project (DP170100654). Y.C. was supported by a University of Sydney Postgraduate Award. T.A.G. was supported by a postgraduate scholarship from the Research Training Program.

Author information


Contributions

P.Y. conceptualized this work. Y.C. and P.Y. reviewed the literature and drafted the manuscript. All authors wrote and edited the Review Article.

Corresponding author

Correspondence to Pengyi Yang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Cao, Y., Geddes, T.A., Yang, J.Y.H. et al. Ensemble deep learning in bioinformatics. Nat. Mach. Intell. 2, 500–508 (2020). https://doi.org/10.1038/s42256-020-0217-y

