
OPAL: A new algorithm for optimal partitioning and learning in non parametric unsupervised environments


Abstract

A nonlinear programming approach to the problem of learning in nonparametric, unsupervised environments is put forth in this study. The approach treats optimal partitioning of the given data set as the objective of such nonparametric unsupervised learning. This partitioning, or clustering, is achieved by applying the recently developed improved flexible polyhedron method (IFPM) to the associated optimization problem. The optimization problem is defined here in terms of a new optimality criterion based on distinct intergroup and intragroup scatters. Details of this new approach, a procedure for deriving an initial partition, and the application of the algorithm to two numerical examples are presented.
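
To make the kind of procedure the abstract describes concrete, the sketch below minimizes a scatter-based partitioning criterion over candidate group centres. It is only an illustration under assumptions of my own, not the authors' method: SciPy's standard Nelder-Mead routine stands in for the IFPM, a ratio of intragroup to intergroup scatter stands in for the paper's optimality criterion, a farthest-point heuristic stands in for the paper's initial-partition procedure, and the function names (scatter_criterion, initial_centers, opal_like_partition) are hypothetical.

    import numpy as np
    from scipy.optimize import minimize

    def scatter_criterion(centers_flat, X, k):
        """Stand-in optimality criterion: intragroup scatter about the candidate
        centres divided by the intergroup scatter of the induced group means.
        Smaller values correspond to tight, well-separated groups."""
        centers = centers_flat.reshape(k, -1)
        # induced partition: each sample joins its nearest candidate centre
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        grand_mean = X.mean(axis=0)
        intra = inter = 0.0
        for j in range(k):
            group = X[labels == j]
            if len(group) == 0:
                return 1e9  # penalize partitions with empty groups
            intra += ((group - centers[j]) ** 2).sum()
            inter += len(group) * ((group.mean(axis=0) - grand_mean) ** 2).sum()
        return intra / inter

    def initial_centers(X, k):
        """Simple initial-partition heuristic (not the paper's): start from the
        sample farthest from the grand mean, then repeatedly add the sample
        farthest from the centres chosen so far."""
        centers = [X[np.argmax(np.linalg.norm(X - X.mean(axis=0), axis=1))]]
        while len(centers) < k:
            d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
            centers.append(X[np.argmax(d)])
        return np.array(centers)

    def opal_like_partition(X, k):
        """Refine the initial centres with the Nelder-Mead flexible polyhedron
        search and return the partition they induce."""
        result = minimize(
            scatter_criterion, initial_centers(X, k).ravel(), args=(X, k),
            method="Nelder-Mead",  # the standard flexible polyhedron method
            options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-9},
        )
        centers = result.x.reshape(k, -1)
        labels = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
        return labels, centers

    if __name__ == "__main__":
        # toy data set: two well-separated Gaussian clouds in the plane
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(4.0, 0.5, (50, 2))])
        labels, centers = opal_like_partition(X, k=2)
        print("group sizes:", np.bincount(labels))
        print("group centres:", centers, sep="\n")

On this toy data the two clouds are recovered as the two groups; swapping in a different partitioning criterion or search routine only requires replacing scatter_criterion or the method argument.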



Cite this article

Sheela, B.V., Dasarathy, B.V. OPAL: A new algorithm for optimal partitioning and learning in non parametric unsupervised environments. International Journal of Computer and Information Sciences 8, 239–253 (1979). https://doi.org/10.1007/BF00977790
