N-semble: neural network based ensemble approach

  • Original Article
  • Published in the International Journal of Machine Learning and Cybernetics

Abstract

Outputs can be predicted from experimental or archived data using machine learning models such as random forests, artificial neural networks, decision trees, and many others. Each model has its own advantages and limitations. To improve accuracy, the outcomes of multiple models can be combined, and the way the predictions of the different models are combined is the key to increasing overall accuracy. In this work, a new approach is presented for building an ensemble model on a regression dataset that overcomes the limitations of the classical ensemble approach: an artificial neural network is trained in a special way to ensemble the predictions of multiple models. N-semble and classical models are compared on various evaluation measures, and it is concluded that N-semble outperforms them.
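The idea the abstract describes, training a neural network to combine the predictions of several base regressors, is a form of stacked generalization. The paper's exact training procedure and dataset are not reproduced here; the following is a minimal sketch using scikit-learn, with a synthetic dataset and hypothetical model choices (random forest, decision tree, linear regression as base models; a small MLP as the combining network) standing in for the paper's setup.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic regression data stands in for the paper's dataset.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

base_models = [
    RandomForestRegressor(n_estimators=100, random_state=0),
    DecisionTreeRegressor(random_state=0),
    LinearRegression(),
]

# Out-of-fold predictions of each base model become the meta-features,
# so the combining network never sees a base model's in-sample fit.
meta_train = np.column_stack(
    [cross_val_predict(m, X_tr, y_tr, cv=5) for m in base_models]
)

# Refit the base models on the full training set for test-time predictions.
for m in base_models:
    m.fit(X_tr, y_tr)
meta_test = np.column_stack([m.predict(X_te) for m in base_models])

# A small ANN learns how to weigh and combine the base predictions.
combiner = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
combiner.fit(meta_train, y_tr)

ensemble_rmse = mean_squared_error(y_te, combiner.predict(meta_test)) ** 0.5
forest_rmse = mean_squared_error(y_te, base_models[0].predict(X_te)) ** 0.5
print(f"ensemble RMSE = {ensemble_rmse:.2f}, random forest RMSE = {forest_rmse:.2f}")
```

Using out-of-fold predictions as the combiner's training input is the standard guard against the meta-learner simply memorizing overfit base-model outputs; how closely this matches the paper's "special" training scheme is an assumption.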



Author information

Corresponding author

Correspondence to Rishith Rayal.

About this article

Cite this article

Rayal, R., Khanna, D., Sandhu, J.K. et al. N-semble: neural network based ensemble approach. Int. J. Mach. Learn. & Cyber. 10, 337–345 (2019). https://doi.org/10.1007/s13042-017-0718-0
