Abstract
Semi-supervised learning (SSL) is useful when few labeled and many unlabeled examples are available. This is usually the case because labeled instances are difficult, expensive, and time-consuming to obtain, since human experts are required for the labeling task (Chapelle, Schölkopf, & Zien, 2010). The training set contains labeled data represented by XL = {(x1, y1), ..., (xl, yl)} and unlabeled data represented as XU = {xl+1, ..., xl+u}, so the total amount of training data is n = l + u. The set of labeled examples is associated with the labels YL = {y1, ..., yl}, where yi ∈ {1, ..., c} and c is the number of classes. The purpose of SSL is to infer the missing labels YU = {yl+1, ..., yn} corresponding to the unlabeled set XU.
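The setup above can be illustrated with a minimal graph-based label propagation sketch in the spirit of the graph-construction methods cited below. This is an illustrative toy example, not the paper's own method: the Gaussian kernel bandwidth, the toy data, and the iteration count are arbitrary choices for demonstration.

```python
import numpy as np

# Toy data: two 1-D clusters; one labeled point per class (y = -1 marks XU).
X = np.array([0.0, 0.1, 0.2, 1.0, 1.1, 1.2])[:, None]
y = np.array([0, -1, -1, 1, -1, -1])

# Build a similarity graph over all n = l + u points (Gaussian kernel).
dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-dist2 / 0.1)
np.fill_diagonal(W, 0.0)
P = W / W.sum(axis=1, keepdims=True)  # row-normalized transition matrix

# Iterative propagation: spread label scores, then clamp the labeled rows.
n_classes = 2
F = np.zeros((len(X), n_classes))
labeled = y >= 0
F[labeled, y[labeled]] = 1.0
for _ in range(100):
    F = P @ F
    F[labeled] = 0.0
    F[labeled, y[labeled]] = 1.0

# Inferred labels YU for the unlabeled points (and the clamped labels YL).
y_pred = F.argmax(axis=1)
print(y_pred)  # → [0 0 0 1 1 1]
```

Because the two clusters are well separated on the graph, the two seed labels propagate to their neighbors and every unlabeled point is assigned the label of its cluster.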
- Berton, L., & de Andrade Lopes, A. (2014). Graph construction based on labeled instances for semi-supervised learning. In 22nd International Conference on Pattern Recognition (pp. 2477--2482). doi: 10.1109/ICPR.2014.428
- Berton, L., Valverde-Rebaza, J., & de Andrade Lopes, A. (2015). Link prediction in graph construction for supervised and semi-supervised learning. In International Joint Conference on Neural Networks (pp. 1--8).
- Chapelle, O., Schölkopf, B., & Zien, A. (2010). Semi-Supervised Learning (1st ed.). The MIT Press.
- Jebara, T., Wang, J., & Chang, S.-F. (2009). Graph construction and b-matching for semi-supervised learning. In 26th Annual International Conference on Machine Learning (pp. 441--448).
- Vega-Oliveros, D. A., Berton, L., Eberle, A., Lopes, A. d. A., & Zhao, L. (2014). Regular graph construction for semi-supervised learning. In Journal of Physics: Conference Series (Vol. 490, pp. 012022-1--012022-4).