Abstract
Classifier-independent feature analysis is a classic problem whose exact solution requires a search that grows exponentially with the number of features. In previous research, we developed a new approach to classifier-independent feature analysis based on relative feature importance (rfi), a metric of a feature's relative usefulness in the optimal subset. Because finding the optimal subset requires exhaustive search, we also developed an estimator for rfi that uses adaptive techniques to reduce the computational load. This paper describes the implementation of both algorithms, direct calculation of rfi and its estimator, on a Connection Machine (CM-5) in CM Fortran. Direct calculation of rfi maps naturally onto CM Fortran because the computationally intensive components of the algorithm manipulate large arrays. The adaptive nature of the estimator, however, makes it more challenging and less efficient to implement on the CM-5.
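The exhaustive-search structure described above can be illustrated with a brief sketch in Python/NumPy rather than CM Fortran. Note that the separability criterion and the subset search below are simplified illustrative assumptions, not the rfi definition or the authors' implementation; the point is that the inner computation is pure large-array arithmetic, which is what maps well onto array-parallel hardware, while the outer subset enumeration grows combinatorially:

```python
import itertools
import numpy as np

def separability(X, y):
    # Between-class / within-class scatter ratio (an illustrative
    # criterion only, not the rfi criterion from the paper).
    classes = np.unique(y)
    mu = X.mean(axis=0)
    between = 0.0
    within = 0.0
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        between += len(Xc) * np.sum((mc - mu) ** 2)
        within += np.sum((Xc - mc) ** 2)
    return between / within if within > 0 else np.inf

def best_subset(X, y, k):
    # Exhaustive search over all k-feature subsets: the number of
    # candidates grows as C(d, k), hence the exponential cost that
    # motivates the adaptive estimator.
    d = X.shape[1]
    return max(itertools.combinations(range(d), k),
               key=lambda s: separability(X[:, list(s)], y))

# Tiny example: feature 0 separates the classes, feature 1 is noise.
X = np.array([[0.0, 10.0], [0.1, -10.0], [5.0, 10.0], [5.1, -10.0]])
y = np.array([0, 0, 1, 1])
print(best_subset(X, y, 1))  # -> (0,)
```

Each call to `separability` is whole-array arithmetic and vectorizes cleanly; the adaptive estimator, by contrast, changes its work depending on intermediate results, which is why it fits a data-parallel machine less comfortably.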
This research was partially supported by a grant from the NRL/ARPA Connection Machine Facility, an effort to rapidly prototype massively parallel processing.
© 1996 Springer-Verlag Berlin Heidelberg
Holz, H.J., Loew, M.H. (1996). Concurrency in feature analysis. In: Dongarra, J., Madsen, K., Waśniewski, J. (eds) Applied Parallel Computing. Computations in Physics, Chemistry and Engineering Science. PARA 1995. Lecture Notes in Computer Science, vol 1041. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-60902-4_34
Print ISBN: 978-3-540-60902-5
Online ISBN: 978-3-540-49670-0