Abstract
Outputs can be predicted from experimental or archived data using machine learning models such as random forests, artificial neural networks, and decision trees. Each model has its own advantages and limitations. To improve accuracy, the predictions of multiple models can be combined, and the way those predictions are combined is key to increasing overall accuracy. In this work, a new approach, N-semble, is presented for building an ensemble model on a regression dataset; it overcomes the limitations of the classical ensemble approach by training an artificial neural network to combine the predictions of multiple base models. N-semble and classical models are compared on various evaluation measures, and N-semble is found to outperform them.
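The idea described above can be sketched in code. The following is a minimal, illustrative example of the general technique (a neural network trained as a meta-learner over base regressors' predictions); the base models, network architecture, hyperparameters, and synthetic data are all assumptions for illustration, not the paper's exact setup.

```python
# Illustrative sketch of the N-semble idea: base regression models make
# predictions, and a small artificial neural network is trained to
# combine them. All model choices and data here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task standing in for experimental or archived data.
X = rng.uniform(-2.0, 2.0, size=(400, 3))
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1]) - X[:, 2]
train, hold = slice(0, 300), slice(300, 400)

# Base model 1: ordinary least squares with a bias term.
A = np.c_[X[train], np.ones(300)]
w, *_ = np.linalg.lstsq(A, y[train], rcond=None)
def linear(Xq):
    return np.c_[Xq, np.ones(len(Xq))] @ w

# Base model 2: k-nearest-neighbour average (k = 5).
def knn(Xq, k=5):
    d = ((Xq[:, None, :] - X[train][None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d, axis=1)[:, :k]
    return y[train][nearest].mean(axis=1)

# Stack the base predictions and standardise them for the meta-learner.
P = np.c_[linear(X[train]), knn(X[train])]
mu, sd = P.mean(0), P.std(0)
Z = (P - mu) / sd

# Meta-learner: a one-hidden-layer network trained by gradient descent
# to map the base models' predictions onto the target.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.array([y[train].mean()])
lr = 0.05
for _ in range(3000):
    h = np.tanh(Z @ W1 + b1)
    out = (h @ W2 + b2).ravel()
    g = 2 * (out - y[train]) / len(out)      # gradient of MSE w.r.t. out
    gh = (g[:, None] * W2.T) * (1 - h ** 2)  # backprop through tanh
    W2 -= lr * h.T @ g[:, None]; b2 -= lr * g.sum()
    W1 -= lr * Z.T @ gh;         b1 -= lr * gh.sum(0)

def nsemble(Xq):
    Zq = (np.c_[linear(Xq), knn(Xq)] - mu) / sd
    return (np.tanh(Zq @ W1 + b1) @ W2 + b2).ravel()

def mse(pred):
    return float(((pred - y[hold]) ** 2).mean())

print("linear MSE   :", mse(linear(X[hold])))
print("kNN MSE      :", mse(knn(X[hold])))
print("ensemble MSE :", mse(nsemble(X[hold])))
```

Because the meta-learner sees only the base models' outputs, it can learn which model to trust in which region of the prediction space, rather than using a fixed averaging rule as in classical ensembling.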



Rayal, R., Khanna, D., Sandhu, J.K. et al. N-semble: neural network based ensemble approach. Int. J. Mach. Learn. & Cyber. 10, 337–345 (2019). https://doi.org/10.1007/s13042-017-0718-0