ABSTRACT
Cost-sensitive learning has been widely used to address the problem of class imbalance. However, cost matrices are often designed manually, and in many real-world applications the cost values are unknown because of limited domain knowledge. This paper proposes a new genetic programming (GP) method to construct cost-sensitive classifiers that does not require manually designed cost values. The experimental results show that the proposed method often outperforms existing GP methods.
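To make the role of the cost matrix concrete, here is a minimal sketch of the standard cost-sensitive decision rule: given class probabilities and a cost matrix, predict the class with the lowest expected cost. The function name, matrix values, and probabilities are illustrative assumptions, not from the paper.

```python
def cost_sensitive_predict(probs, cost_matrix):
    """Pick the class that minimises expected cost.

    cost_matrix[i][j] is the cost of predicting class j
    when the true class is i (illustrative values only).
    """
    n_classes = len(cost_matrix[0])
    expected = [
        sum(probs[i] * cost_matrix[i][j] for i in range(len(probs)))
        for j in range(n_classes)
    ]
    return min(range(n_classes), key=expected.__getitem__)

# Imbalanced binary task: misclassifying the rare positive class (1)
# is assumed to be 10x more costly than misclassifying class 0.
C = [[0, 1],    # true 0: predicting 1 costs 1
     [10, 0]]   # true 1: predicting 0 costs 10

# A plain argmax over [0.7, 0.3] would predict class 0; the
# cost-sensitive rule predicts class 1 (expected cost 0.7 vs 3.0).
print(cost_sensitive_predict([0.7, 0.3], C))  # → 1
```

When the entries of `cost_matrix` are unknown, as in the setting this paper addresses, they cannot simply be supplied by hand; the proposed GP method learns the cost values together with the classifier.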
Index Terms
- A genetic programming method for classifier construction and cost learning in high-dimensional unbalanced classification
Recommendations
Genetic programming for development of cost-sensitive classifiers for binary high-dimensional unbalanced classification
Abstract: Genetic programming (GP) has a built-in ability for feature selection when developing classifiers for classification with high-dimensional data. However, due to the problem of class imbalance, the classifiers developed by GP are ...
Highlights: This paper investigates the use of cost-sensitive learning with genetic programming (GP) when the cost matrix is unknown.
Comparing ensemble learning approaches in genetic programming for classification with unbalanced data
GECCO '13 Companion: Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation
This paper compares three approaches to evolving ensembles in Genetic Programming (GP) for binary classification with unbalanced data. The first uses bagging with sampling, while the other two use Pareto-based multi-objective GP (MOGP) for the trade-off ...
Building boosted classification tree ensemble with genetic programming
GECCO '18: Proceedings of the Genetic and Evolutionary Computation Conference Companion
Adaptive boosting (AdaBoost) is a method for building a classification ensemble that combines multiple classifiers built in an iterative process of reweighting instances. This method has proved to be a very effective classification method, therefore it was ...