Abstract
In pattern recognition and data-driven learning, dimensionality reduction is an essential step for distilling high-dimensional data into a manageable number of informative features, and the growing body of research underscores its importance in current applications. Traditional reduction techniques rely on feature-ranking schemes; the approach described here instead reduces the data without degrading performance, improving efficiency while lowering the error rate. Dimensionality reduction is not confined to pattern recognition: it also underpins many high-dimensional data-processing tasks such as text categorization, document indexing and, notably, gene expression analysis. Reduction itself proceeds in two stages, feature extraction and feature selection. The proposed process follows the statistical pattern recognition paradigm, and a framework for this approach is summarized herein in four steps: (1) evaluation, (2) data acquisition, (3) feature selection, and (4) statistical modeling of the selected features. We show how to reduce dimensionality using a group clustering and convex optimization model. The paper concludes with an integrated implementation of the dimensionality reduction process that aims to maximize both efficiency and accuracy.
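The group-based reduction idea in the abstract can be illustrated with a minimal sketch: cluster the feature columns into groups of mutually similar features and keep one representative per group, so the data matrix shrinks in width without discarding distinct sources of information. This is a hedged illustration only — the function name `group_select_features`, the k-means-style grouping, and the synthetic data are assumptions for exposition, not the paper's exact algorithm.

```python
import numpy as np

def group_select_features(X, n_groups, n_iter=100):
    """Reduce dimensionality by clustering similar feature columns into
    groups and keeping one representative feature per group.
    (Illustrative sketch of group-clustering feature selection; not the
    authors' published method.)"""
    # Standardize columns so grouping reflects shape, not scale.
    F = (X - X.mean(0)) / (X.std(0) + 1e-12)
    pts = F.T  # one point per feature

    # Deterministic farthest-point initialization of group centers.
    centers = [pts[0]]
    while len(centers) < n_groups:
        d = np.min([((pts - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(pts[d.argmax()])
    centers = np.array(centers)

    # Lloyd-style refinement of the feature groups.
    for _ in range(n_iter):
        d = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        new_centers = np.array(
            [pts[labels == k].mean(0) if (labels == k).any() else centers[k]
             for k in range(n_groups)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers

    # Representative = the feature closest to its group center.
    keep = []
    for k in range(n_groups):
        idx = np.where(labels == k)[0]
        if idx.size:
            dk = ((pts[idx] - centers[k]) ** 2).sum(1)
            keep.append(idx[dk.argmin()])
    keep = np.array(sorted(keep))
    return X[:, keep], keep

# Usage: 100 samples, 6 features forming 2 correlated groups -> keep 2.
rng = np.random.default_rng(1)
base = rng.normal(size=(100, 2))
X = np.hstack([base[:, [0]] + 0.01 * rng.normal(size=(100, 3)),
               base[:, [1]] + 0.01 * rng.normal(size=(100, 3))])
X_red, kept = group_select_features(X, n_groups=2)
```

On this synthetic data, features 0-2 are near-copies of one latent variable and features 3-5 of another, so the sketch keeps one column from each block.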
Cite this article
Li, S., Wang, M., Liu, S. et al. A novel pattern recognition technique based on group clustering computing and convex optimization for dimensionality reduction. Cluster Comput 21, 805–811 (2018). https://doi.org/10.1007/s10586-017-0952-y