ABSTRACT
Cars are an essential part of everyday life, and today's market offers a plethora of models from many manufacturers across all segments. A buyer must weigh many factors when purchasing a car, which makes the decision difficult. In this paper we develop an ensemble-learning method to aid buyers in making this decision. Bagging, boosting, and voting ensembles are used to improve classification accuracy. We also apply class association rules and evaluate whether they outperform collaborative filtering for recommending items to the user.
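The three ensemble strategies named above can be sketched as follows. This is a minimal illustration using scikit-learn with decision-tree and naive Bayes base learners on synthetic data; the paper does not specify its implementation, datasets, or base classifiers, so all of those choices here are assumptions.

```python
# Hypothetical sketch of the bagging, boosting, and voting ensembles
# mentioned in the abstract; scikit-learn and the base learners are
# illustrative choices, not the paper's actual setup.
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, AdaBoostClassifier,
                              VotingClassifier)
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a car-evaluation dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: many trees trained on bootstrap resamples, predictions averaged.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                            random_state=0)

# Boosting: trees trained sequentially, each reweighting hard examples.
boosting = AdaBoostClassifier(n_estimators=25, random_state=0)

# Voting: heterogeneous classifiers combined by majority vote.
voting = VotingClassifier([("nb", GaussianNB()),
                           ("tree", DecisionTreeClassifier(random_state=0)),
                           ("boost", AdaBoostClassifier(random_state=0))])

scores = {name: clf.fit(X_tr, y_tr).score(X_te, y_te)
          for name, clf in [("bagging", bagging), ("boosting", boosting),
                            ("voting", voting)]}
print(scores)
```

Comparing the three held-out scores against a single base classifier is the kind of experiment the abstract describes.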