Abstract
Large data sets consisting of a relatively small number of multidimensional feature vectors are often encountered. Data sets with this structure appear, inter alia, in the case of genetic data. Various types of classifiers are designed on the basis of such data sets. A small number of multivariate feature vectors is almost always linearly separable. For this reason, linear classifiers play a fundamental role in the case of small samples of multivariate vectors.
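The near-certain linear separability of small high-dimensional samples can be checked numerically. The following sketch (not from the paper; the sample sizes and the perceptron rule are illustrative assumptions) draws far fewer random points than there are dimensions, assigns random labels, and runs the classical perceptron, which converges only when the labeled sample is linearly separable:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 10, 100                      # few vectors, many dimensions (n << d)
X = rng.standard_normal((n, d))
y = rng.choice([-1, 1], size=n)     # arbitrary labels

# Classical perceptron: it terminates with zero mistakes
# if and only if the labeled sample is linearly separable.
w, b = np.zeros(d), 0.0
for epoch in range(1000):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:  # misclassified point
            w += yi * xi            # perceptron update
            b += yi
            mistakes += 1
    if mistakes == 0:
        break

# All points now lie strictly on the correct side of the hyperplane.
print(np.all(y * (X @ w + b) > 0))
```

Repeating the experiment with fresh random labels virtually always succeeds, which is the phenomenon the abstract refers to: with n much smaller than d, random labelings are linearly separable with probability close to one.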
Maximizing margins is one of the fundamental principles in classification. The Euclidean (L2) margin is a basic concept in the support vector machine (SVM) method of classifier design. An alternative approach to designing classifiers is linked to the local maximization of margins in the L1 norm. This approach also allows designing complexes of linear classifiers on the basis of small samples of multivariate vectors.
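The difference between the two margin types can be illustrated with the standard dual-norm formula: the distance of a point x from the hyperplane w·x + b = 0 measured in the Lp norm equals |w·x + b| / ||w||_q, where q is the dual exponent (1/p + 1/q = 1). Hence the Euclidean (L2) margin divides by ||w||_2, while the L1 margin divides by ||w||_∞. A minimal numeric sketch (the hyperplane and point are illustrative values, not taken from the paper):

```python
import numpy as np

# Illustrative hyperplane w·x + b = 0 in the plane and a test point.
w = np.array([3.0, 4.0])
b = -1.0
x = np.array([2.0, 1.0])

activation = abs(w @ x + b)                         # |3*2 + 4*1 - 1| = 9

# L2 (Euclidean) margin: divide by the dual L2 norm, ||w||_2 = 5.
l2_margin = activation / np.linalg.norm(w, 2)       # 9 / 5 = 1.8

# L1 margin: divide by the dual L-infinity norm, ||w||_inf = 4.
l1_margin = activation / np.linalg.norm(w, np.inf)  # 9 / 4 = 2.25

print(l2_margin, l1_margin)
```

The two margins rank hyperplanes differently, so maximizing the L1 margin generally selects a different separating hyperplane than the SVM's L2 criterion.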
Acknowledgments
The presented study was supported by the grant WZ/WI-IIT/3/2020 from the Bialystok University of Technology and funded from the resources for research of the Polish Ministry of Science and Higher Education.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Bobrowski, L. (2021). Complexes of Low Dimensional Linear Classifiers with L1 Margins. In: Nguyen, N.T., Chittayasothorn, S., Niyato, D., Trawiński, B. (eds) Intelligent Information and Database Systems. ACIIDS 2021. Lecture Notes in Computer Science(), vol 12672. Springer, Cham. https://doi.org/10.1007/978-3-030-73280-6_3
Print ISBN: 978-3-030-73279-0
Online ISBN: 978-3-030-73280-6