Abstract
Hyperspectral imagery is a powerful source of information for recognition problems in a variety of fields. However, the resulting data volume is a challenge for classification methods, especially given industrial context requirements. Support Vector Machines (SVMs), classifiers commonly used for hyperspectral data, are originally designed for binary problems. Basing our study on the basic belief assignment (bba) allocation for binary classifiers proposed in [12], we investigate different strategies to combine two-class SVMs and tackle the multi-class problem. We evaluate the use of belief functions for SVM fusion with hyperspectral data in a waste-sorting industrial application. We specifically highlight two ways of building a fast multi-class classifier within the belief functions framework that takes into account the process uncertainties and can use different information sources, such as complementary spectral features.
References
Cavalli, R., Licciardi, G., Chanussot, J.: Archaeological structures using nonlinear principal component analysis applied to airborne hyperspectral image. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 6(2), 659–669 (2013)
Chen, G., Qian, S.-E.: Denoising of hyperspectral imagery using principal component analysis and wavelet shrinkage. IEEE Trans. Geosci. Remote Sens. 49(3), 973–980 (2011)
Denoeux, T.: Analysis of evidence-theoretic decision rules for pattern classification. Pattern Recogn. 30(7), 1095–1107 (1997)
Denœux, T.: Likelihood-based belief function: justification and some extensions to low-quality data. Int. J. Approx. Reason. 55(7), 1535–1547 (2014)
Grahn, H., Geladi, P.: Techniques and Applications of Hyperspectral Image Analysis. Wiley, New York (2007)
King, R.L., Ruffin, C., LaMastus, F., Shaw, D.: The analysis of hyperspectral data using Savitzky-Golay filtering - practical issues (part 2). In: Proceedings of IGARSS 1999, vol. 1, pp. 398–400. IEEE (1999)
Kuo, B.-C., Ho, H.-H., Li, C.-H., Hung, C.-C., Taur, J.-S.: A kernel-based feature selection method for SVM with RBF kernel for hyperspectral image classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 7(1), 317–326 (2014)
Melgani, F., Bruzzone, L.: Classification of hyperspectral remote sensing images with support vector machines. IEEE Trans. Geosci. Remote Sens. 42(8), 1778–1790 (2004)
Quost, B., Denœux, T., Masson, M.-H.: Pairwise classifier combination using belief functions. Pattern Recogn. Lett. 28(5), 644–653 (2007)
Samiappan, S., Prasad, S., Bruce, L.M.: Non-uniform random feature selection, kernel density scoring with SVM based ensemble classification for hyperspectral image analysis. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 6(2), 792–800 (2013)
Vaiphasa, C.: Consideration of smoothing techniques for hyperspectral remote sensing. ISPRS J. Photogram. Remote Sens. 60(2), 91–99 (2006)
Xu, P., Davoine, F., Zha, H., Denoeux, T.: Evidential calibration of binary SVM classifiers. Int. J. Approx. Reason. 72, 55–70 (2016)
Yang, J.-H., Wang, L.-G., Qian, J.-X.: Hyperspectral image classification based on spatial and spectral features and sparse representation. Appl. Geophys. 11(4), 489–499 (2014)
Appendix: Evidential Calibration
When handling binary classifiers, the discernment frame is \(\Omega ^{b}= \left\{ \left\{ 0\right\} ,\left\{ 1\right\} \right\} \). For a given SVM, with its own specificities in terms of number of training samples and learning-step performance, we aim at defining a belief function for each score that reflects the confidence we may have in each class. This belief function will then be used for forecasting, taking into account the specificities of the whole training set. Explicitly, for each score s, the mass function, denoted m, is derived from the contour function \(\omega \rightarrow pl_X\left( \omega ,s\right) \), where \(\omega =P\left( y=1\left| s\right. \right) \) (note that \(\omega \) is not a class but a probability).
To build the contour function on \(\omega \), [12] uses the idea behind logistic regression: approximating the probability \(P\left( y=1\left| s\right. \right) \) by a sigmoid \(sig_s\left( \theta \right) =\left[ 1+\exp (\theta _0+\theta _1 s)\right] ^{-1}\), where the parameter \(\hat{\theta }=\left( \theta _0, \theta _1\right) \) is determined by maximizing the likelihood function \(L_X\left( \theta \right) \) over the training set \(X=\{(s_1,y_1), \dots , (s_N,y_N) \}\), where, for each sample index i, \(s_i\in \mathbb {R}\) is the score given by the considered classifier and \(y_i\in \left\{ 0,1\right\} \) is its true label. The contour function of interest is then drawn for a given score value. It derives from the 2D function plotting the relative likelihood \(\frac{L_X(\theta )}{L_X(\hat{\theta })}\) versus \(\theta =(\theta _0,\theta _1)\). For any given pair \((s,\omega )\), the set of \(\theta \) values such that \(sig_s(\theta )=\omega \) (i.e. \(sig_s^{-1}(\omega )\)) is a straight line in \(\mathbb {R}^2\), so that the contour function value can be determined as the maximum value over this straight line: \(pl_X\left( \omega ,s\right) =\max _{\theta \in sig_s^{-1}(\omega )}\frac{L_X(\theta )}{L_X(\hat{\theta })}\),
with \(L_X\left( \theta \right) =\prod _{i=1}^N p_i^{y_i} (1-p_i)^{1-y_i}\), where \(p_i=\frac{1}{1+\exp \left( \theta _0+\theta _1 s_i\right) }\). Finally, from each \(pl^{\Omega _{j,k}^{b}}_X\), the corresponding mass function \(m^{\Omega _{j,k}^{b}}\) on the binary discernment frame \(\Omega _{j,k}^{b}\) is derived using the likelihood-based belief function approach to statistical inference proposed by Shafer and further justified by Denœux [4].
© 2016 Springer International Publishing Switzerland
Lachaize, M., Le Hégarat-Mascle, S., Aldea, E., Maitrot, A., Reynaud, R. (2016). SVM Classifier Fusion Using Belief Functions: Application to Hyperspectral Data Classification. In: Vejnarová, J., Kratochvíl, V. (eds) Belief Functions: Theory and Applications. BELIEF 2016. Lecture Notes in Computer Science(), vol 9861. Springer, Cham. https://doi.org/10.1007/978-3-319-45559-4_12
Print ISBN: 978-3-319-45558-7
Online ISBN: 978-3-319-45559-4