
K-harmonic means clustering algorithm using feature weighting for color image segmentation

Published in: Multimedia Tools and Applications

Abstract

This paper proposes K-harmonic means (KHM) clustering algorithms that use feature weighting for color image segmentation. To account for the contribution of each feature to clustering, feature weights that are updated automatically during the clustering procedure are introduced into the distance between each pair of data points, yielding improved versions of KHM and fuzzy KHM. Furthermore, the Lab color space, local homogeneity, and texture are used to build a feature vector better suited to color image segmentation, and a feature group weighting strategy is introduced to identify the importance of the different types of features. Experimental results demonstrate that the proposed feature group weighted KHM-type algorithms achieve better segmentation performance and effectively distinguish the importance of different features to clustering.




Acknowledgements

This research is supported by the National Natural Science Foundation of China (Grant No. 61373126) and the Fundamental Research Funds for the Central Universities of China (Grant No. JUSRP51510).

Corresponding author
Corresponding author

Correspondence to Zhiping Zhou.

Appendices

Appendix A

In this appendix, the detailed derivations of the update equations of c j and w q are provided. First, the partial derivative of L with respect to c j is calculated as follows.

$$ \frac{{\partial L}}{{\partial {{\boldsymbol{c}}_{j}}}} = - kp\sum\limits_{i = 1}^{n} {\frac{{{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{-p-2}}{\text{diag}}({{\boldsymbol{w}}^{2}})({{\boldsymbol{x}}_{i}} - {{\boldsymbol{c}}_{j}})}} {{{{\left( {\sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{- p}}}} \right)}^{2}}}}} $$
(A.1)

As can be seen from the above equation, diag(w 2) does not depend on i, and \(\frac{{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{-p-2}}}{{{\left( \sum\nolimits_{j = 1}^{k} {{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{-p}} \right)}^{2}}} = {m_{WKHM}}({{\boldsymbol{c}}_{j}}/{{\boldsymbol{x}}_{i}}) \cdot {w_{WKHM}}({{\boldsymbol{x}}_{i}})\); thus the update equation of c j is obtained as (2) by setting (A.1) equal to zero. Note, however, that the Euclidean distance d i j is replaced by \(d_{ij}^{(\boldsymbol{w})}\) in \({m_{WKHM}}({{\boldsymbol{c}}_{j}}/{{\boldsymbol{x}}_{i}})\) and \({w_{WKHM}}({{\boldsymbol{x}}_{i}})\).

$$ {m_{WKHM}}({{{{\boldsymbol{c}}_{j}}} \left/ {{{\boldsymbol{x}}_{i}}}\right.}) = \frac{{{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{- p - 2}}}}{{\sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{- p - 2}}}} } $$
(A.2)
$$ {w_{WKHM}}({{\boldsymbol{x}}_{i}}) = \frac{{\sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{- p - 2}}}} }{{{{\left( \sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{- p}}} \right)}^{2}}}} $$
(A.3)
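As a concrete illustration, the membership (A.2), data weight (A.3), and the KHM-form center update can be sketched in NumPy. This is a minimal sketch, not the authors' implementation: the function names, the choice p = 3.5, and the small epsilon guard against division by zero are our own assumptions.

```python
import numpy as np

def khm_memberships(X, C, w, p=3.5):
    """Membership m(c_j/x_i) per (A.2) and data weight w(x_i) per (A.3),
    using the feature-weighted distance
    d_ij^(w) = sqrt(sum_q w_q^2 (x_iq - c_jq)^2).

    X: (n, d) data, C: (k, d) cluster centers, w: (d,) feature weights.
    """
    diff = X[:, None, :] - C[None, :, :]                 # (n, k, d)
    d = np.sqrt(((w ** 2) * diff ** 2).sum(axis=2))      # (n, k) weighted distances
    d = np.maximum(d, 1e-12)                             # guard against d_ij = 0
    inv_p2 = d ** (-p - 2)                               # [d_ij^(w)]^{-p-2}
    m = inv_p2 / inv_p2.sum(axis=1, keepdims=True)       # (A.2): rows sum to 1
    wx = inv_p2.sum(axis=1) / (d ** -p).sum(axis=1) ** 2 # (A.3)
    return m, wx

def update_centers(X, m, wx):
    """Center update of the same form as (2): a data mean weighted by
    m(c_j/x_i) * w(x_i)."""
    coef = m * wx[:, None]                               # (n, k)
    return (coef.T @ X) / coef.sum(axis=0)[:, None]
```

The epsilon guard matters in practice: a data point that coincides with a center would otherwise produce a division by zero in the harmonic terms.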

Then, setting the partial derivative of L with respect to w q , given in (A.4), equal to zero yields the update equation of w q as (A.5), from which the Lagrange multiplier λ must still be eliminated.

$$ \frac{{\partial L}}{{\partial{w_{q}}}}=kp{w_{q}}\sum\limits_{i=1}^{n} {\frac{{\sum\nolimits_{j=1}^{k} {\left( {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{-p-2}}\cdot {{({x_{iq}}-{c_{jq}})}^{2}}} \right)}}}{{{{\left( {\sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{- p}}}} \right)}^{2}}}}}-\lambda $$
(A.4)
$$ {w_{q}} = \frac{\lambda}{{kp\sum\limits_{i=1}^{n} {\frac{{\sum\nolimits_{j=1}^{k} {\left( {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{-p-2}}\cdot{{({x_{iq}}-{c_{jq}})}^{2}}}\right)}}}{{{{\left( {\sum\nolimits_{j=1}^{k} {{{\left[d_{ij}^{({\boldsymbol{w}})}\right]}^{-p}}}} \right)}^{2}}}}}} } $$
(A.5)

Substituting (A.5) into the constraint on the feature weights, \({w_{q}} \in [0,1],\;\sum \limits _{q = 1}^{d} {{w_{q}} = 1}\), the multiplier λ is obtained as follows.

$$ \lambda = {\left[ {\sum\limits_{l = 1}^{d} {{{\left( {kp\sum\limits_{i = 1}^{n} {\frac{{\sum\nolimits_{j = 1}^{k} {\left( {\left[d_{ij}^{({\boldsymbol{w}})} \right]^{- p - 2} \cdot (x_{il} - c_{jl} )^{2}} \right)}} }{{\left( {\sum\nolimits_{j = 1}^{k} {\left[d_{ij}^{({\boldsymbol{w}})} \right]^{- p}} } \right)^{2}} }}} } \right)}^{-1}}} } \right]^{-1}} $$
(A.6)

Finally, substituting (A.6) into (A.5) yields the update equation of w q (q = 1, 2, …, d) shown as (9).
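The elimination of λ can be made concrete: since (A.5) gives w q proportional to the reciprocal of a per-feature scatter term S q, normalizing the reciprocals to sum to one implements (A.5) with (A.6) substituted in, and the common factor kp cancels. Below is a sketch under the same assumptions as before (function name, p = 3.5, and epsilon guards are ours).

```python
import numpy as np

def update_feature_weights(X, C, w, p=3.5):
    """Feature-weight update derived in Appendix A (a sketch).

    For each feature q, accumulate the scatter term
      S_q = sum_i [ sum_j d_ij^{-p-2} (x_iq - c_jq)^2 ] / (sum_j d_ij^{-p})^2
    with the current weighted distances d_ij^(w), then return the
    normalized reciprocals 1/S_q (the kp factor and the Lagrange
    multiplier cancel in the normalization).
    """
    diff = X[:, None, :] - C[None, :, :]                        # (n, k, d)
    d = np.sqrt(((w ** 2) * diff ** 2).sum(axis=2))             # (n, k)
    d = np.maximum(d, 1e-12)
    num = (d[:, :, None] ** (-p - 2) * diff ** 2).sum(axis=1)   # (n, d)
    den = (d ** -p).sum(axis=1) ** 2                            # (n,)
    S = (num / den[:, None]).sum(axis=0)                        # (d,) scatter per feature
    inv = 1.0 / np.maximum(S, 1e-12)
    return inv / inv.sum()                                      # weights sum to 1
```

Note the intuition: a feature whose weighted within-cluster scatter S q is small receives a large weight, so compact, discriminative features dominate the distance.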

Appendix B

In this appendix, the detailed derivation of the update equation (20) is provided. First, the partial derivative of L 2 with respect to c j (j = 1, 2, …, k) is calculated and set to zero; the resulting update equation of the cluster centers has the same form as (2), with \(d_{ij}^{(g\boldsymbol {w})}\) used in \({m_{WKHM}}({{\boldsymbol{c}}_{j}}/{{\boldsymbol{x}}_{i}})\) and \({w_{WKHM}}({{\boldsymbol{x}}_{i}})\). For the feature weights, we first analyze the case of \(q \in G(1)\); the partial derivative of L 2 with respect to w q is calculated as follows.

$$ \frac{{\partial {L_{2}}}}{{\partial {w_{q}}}} = kp{w_{q}}\sum\limits_{i = 1}^{n} {\frac{{{v_{1}}\sum\nolimits_{j = 1}^{k} {\left( {{{\left[d_{ij}^{(g{\boldsymbol{w}})}\right]}^{- p - 2}} \cdot {{({x_{iq}} - {c_{jq}})}^{2}}} \right)}} } {{{{\left( {\sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{(g{\boldsymbol{w}})}\right]}^{-p}}}} \right)}^{2}}}}}-{\lambda_{1}} $$
(B.1)

Then, setting (B.1) equal to zero yields the expression for w q in (B.2). Substituting (B.2) into \(\sum \limits _{q \in G(1)} {{w_{q}} = 1}\), the constraint on feature group G(1), gives λ 1, which is substituted back into (B.2); the resulting update equation for the feature weights of G(1) is shown as (20).

$$ {w_{q}} = \frac{{{\lambda_{1}}}}{{kp{v_{1}}\sum\limits_{i = 1}^{n} {\frac{{\sum\nolimits_{j = 1}^{k} {\left( {{{\left[d_{ij}^{(g{\boldsymbol{w}})}\right]}^{- p - 2}} \cdot {{({x_{iq}} - {c_{jq}})}^{2}}} \right)}} }{{{{\left( {\sum\nolimits_{j = 1}^{k} {{{\left[d_{ij}^{(g{\boldsymbol{w}})}\right]}^{- p}}}} \right)}^{2}}}}}} } $$
(B.2)
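The group-wise case follows the same pattern: from (B.2), within a single group G(t) the common factor kp·v t cancels when the weights are renormalized over that group alone. The sketch below assumes `groups` is a list of index arrays partitioning the d features (e.g. color, homogeneity, texture), and, as a simplification, uses the feature-weighted distance as a stand-in for \(d_{ij}^{(g\boldsymbol{w})}\); the full algorithm also folds the group importances v t into the metric itself.

```python
import numpy as np

def update_group_feature_weights(X, C, w, groups, p=3.5):
    """Per-group feature-weight update sketched from Appendix B.

    Each entry of `groups` is an index array selecting one feature
    group G(t); within each group the weights are renormalized so that
    sum_{q in G(t)} w_q = 1, the factor kp*v_t in (B.2) being common
    to all features of the group.
    """
    diff = X[:, None, :] - C[None, :, :]                        # (n, k, d)
    d = np.sqrt(((w ** 2) * diff ** 2).sum(axis=2))             # (n, k)
    d = np.maximum(d, 1e-12)
    num = (d[:, :, None] ** (-p - 2) * diff ** 2).sum(axis=1)   # (n, d)
    den = (d ** -p).sum(axis=1) ** 2                            # (n,)
    S = (num / den[:, None]).sum(axis=0)                        # (d,) scatter per feature
    w_new = np.empty_like(w)
    for idx in groups:                                          # normalize per group
        inv = 1.0 / np.maximum(S[idx], 1e-12)
        w_new[idx] = inv / inv.sum()
    return w_new
```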


Cite this article

Zhou, Z., Zhao, X. & Zhu, S. K-harmonic means clustering algorithm using feature weighting for color image segmentation. Multimed Tools Appl 77, 15139–15160 (2018). https://doi.org/10.1007/s11042-017-5096-9
