
Fuzzy entropy functions based on perceived uncertainty

  • Regular Paper
  • Published in: Knowledge and Information Systems

Abstract

The paper presents fuzzy entropy functions based on perceived vagueness. The proposed entropy functions are based on the principle that different agents may perceive a membership grade differently. The perceived uncertainty for a membership grade is determined through a gain function. In this light, new variants of the popular fuzzy entropy functions are developed. Inspired by these variants, a novel fuzzy entropy function is also developed. The proposed functions are extended to the probabilistic fuzzy domain. A case study is included to illustrate the applicability of the work.



Notes

  1. It is interesting to note that this is missing from the conventional De Luca and Termini entropy. As a result, the entropy grows with the number of elements considered; hence, the entropy increases if we adopt a refined definition of a fuzzy set, which is obviously counter-intuitive.
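
     To see the effect concretely, here is a minimal numeric sketch (assuming the unnormalized De Luca–Termini form \(-\sum _i \left( \mu _i \log \mu _i + (1 - \mu _i) \log (1 - \mu _i)\right) \); the function name is illustrative), which shows the entropy doubling when the universe is refined although the fuzzy set itself is unchanged:

```python
import numpy as np

def dlt_entropy(mu):
    # Unnormalized De Luca-Termini entropy: no 1/n factor, so the
    # value scales with the number of elements in the universe.
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1 - 1e-12)
    return float(-np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)))

coarse = [0.3, 0.7]          # a fuzzy set on a two-element universe
fine = [0.3, 0.3, 0.7, 0.7]  # the same set on a refined universe
print(dlt_entropy(coarse))   # ~1.222
print(dlt_entropy(fine))     # ~2.443 -- doubles, though the set is unchanged
```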

References

  1. Aggarwal M (2019) Decision aiding model with entropy-based subjective utility. Inf Sci 501:558–572

  2. Aggarwal M (2020) Bridging the gap between probabilistic and fuzzy entropy. IEEE Trans Fuzzy Syst 28(9):2175–2184

  3. Aggarwal M (2021) Attitude-based entropy function and applications in decision-making. Eng Appl Artif Intell 104:104290

  4. Aggarwal M, Hanmandlu M (2016) Representing uncertainty with information sets. IEEE Trans Fuzzy Syst 24(1):1–15

  5. Allahverdyan AE, Galstyan A, Abbas AE, Struzik ZR (2018) Adaptive decision making via entropy minimization. Int J Approx Reason 103:270–287

  6. De Luca A, Termini S (1972) A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory. Inf Control 20(4):301–312

  7. Dujet C (1983) Separation functions and measures of fuzziness. In: IFAC proceedings volumes. IFAC symposium on fuzzy information, knowledge representation and decision analysis, Marseille, France, vol 16, no 13, pp 91–96

  8. Ebanks BR (1983) On measures of fuzziness and their representations. J Math Anal Appl 94(1):24–37

  9. Emptoz H (1981) Nonprobabilistic entropies and indetermination measures in the setting of fuzzy sets theory. Fuzzy Sets Syst 5(3):307–317

  10. Fan J, Xie W (1999) Distance measure and induced fuzzy entropy. Fuzzy Sets Syst 104(2):305–314

  11. Gao C, Lai Z, Zhou J, Wen J, Wong WK (2019) Granular maximum decision entropy-based monotonic uncertainty measure for attribute reduction. Int J Approx Reason 104:9–24

  12. Gou XJ, Xu ZS, Liao HC (2017) Hesitant fuzzy linguistic entropy and cross-entropy measures and alternative queuing method for multiple criteria decision making. Inf Sci 388–389:225–246

  13. Hirota K, Pedrycz W (1986) Subjective entropy of probabilistic sets and fuzzy cluster analysis. IEEE Trans Syst Man Cybern 16(1):173–179

  14. Kaufmann A (1980) Introduction to the theory of fuzzy subsets: fundamental theoretical elements. Academic Press, New York

  15. Knopfmacher J (1975) On measures of fuzziness. J Math Anal Appl 49(3):529–534

  16. Kosko B (1986) Fuzzy entropy and conditioning. Inf Sci 40(2):165–174

  17. Lee HM, Chen CM, Chen JM, Jou YL (2001) An efficient fuzzy classifier with feature selection based on fuzzy entropy. IEEE Trans Syst Man Cybern Part B (Cybern) 31(3):426–432

  18. Liu XC (1992) Entropy, distance measure and similarity measure of fuzzy sets and their relations. Fuzzy Sets Syst 52(3):305–318

  19. Loo SG (1977) Measures of fuzziness. Cybernetica 20:201–210

  20. Pal NR, Bezdek JC (1994) Measuring fuzzy uncertainty. IEEE Trans Fuzzy Syst 2(2):107–118

  21. Pal NR, Pal SK (1989) Object-background segmentation using new definitions of entropy. IEE Proc Comput Digit Tech 136(4):284–295

  22. Pal NR, Pal SK (1991) Entropy: a new definition and its applications. IEEE Trans Syst Man Cybern 21(5):1260–1270

  23. Raghu S, Sriraam N, Kumar GP, Hegde AS (2018) A novel approach for real-time recognition of epileptic seizures using minimum variance modified fuzzy entropy. IEEE Trans Biomed Eng 65(11):2612–2621

  24. Romero-Troncoso RJ, Saucedo-Gallaga R, Cabal-Yepez E, Garcia-Perez A, Osornio-Rios RA, Alvarez-Salas R, Miranda-Vidales H, Huber N (2011) FPGA-based online detection of multiple combined faults in induction motors through information entropy and fuzzy inference. IEEE Trans Ind Electron 58(11):5263–5270

  25. Sander W (1989) On measures of fuzziness. Fuzzy Sets Syst 29(1):49–55

  26. Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27(4):623–656

  27. Singh P (2020) A neutrosophic-entropy based adaptive thresholding segmentation algorithm: a special application in MR images of Parkinson’s disease. Artif Intell Med 104:101838

  28. Singh P (2020) A neutrosophic-entropy based clustering algorithm (NEBCA) with HSV color system: a special application in segmentation of Parkinson’s disease (PD) MR images. Comput Methods Progr Biomed 189:105317

  29. Singh P, Borah B (2013) High-order fuzzy-neuro expert system for time series forecasting. Knowl Based Syst 46:12–21

  30. Singh P, Dhiman G (2018) Uncertainty representation using fuzzy-entropy approach: special application in remotely sensed high-resolution satellite images (RSHRSIs). Appl Soft Comput 72:121–139

  31. Singh P, Huang Y-P, Lee T-T (2019) A novel ambiguous set theory to represent uncertainty and its application to brain MR image segmentation. In: 2019 IEEE international conference on systems, man and cybernetics (SMC), pp 2460–2465

  32. Trillas E, Riera T (1978) Entropies in finite fuzzy sets. Inf Sci 15(2):159–168

  33. Wang X, Dong C (2009) Improving generalization of fuzzy if–then rules by maximizing fuzzy entropy. IEEE Trans Fuzzy Syst 17(3):556–567

  34. Wang ZX (1984) Fuzzy measures and measures of fuzziness. J Math Anal Appl 104(2):589–601

  35. Weber S (1984) Measures of fuzzy sets and measures of fuzziness. Fuzzy Sets Syst 13(3):247–271

  36. Xie WX, Bedrosian SD (1984) An information measure for fuzzy sets. IEEE Trans Syst Man Cybern SMC 14(1):151–156

  37. Yang M, Nataliani Y (2018) A feature-reduction fuzzy clustering algorithm based on feature-weighted entropy. IEEE Trans Fuzzy Syst 26(2):817–835

  38. Yue C (2017) Entropy-based weights on decision makers in group decision-making setting with hybrid preference representations. Appl Soft Comput 60:737–749

  39. Zadeh LA (1968) Probability measures of fuzzy events. In: Fuzzy sets and applications: selected papers by L. A. Zadeh. Wiley, pp 45–51

Author information

Correspondence to Manish Aggarwal.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A. Proofs for the properties of P-Sh entropy

The basic properties are investigated taking \(\gamma = 1\).

Proof of Property 1

From (7), we know that \(H_{\text {P-Sh}, d} = \frac{1}{n}\sum _{i = 1}^{n} (1 - \Delta \mu _i){\mathcal {G}}_i\). Since \(\mu _i \in [0, 1]\), we have \(\Delta \mu _i \in [0, 1]\) [see (3)]. Hence \((1 - \Delta \mu _i) \ge 0\) and \({\mathcal {G}}_i = -\log (\Delta \mu _i) \ge 0\). As both factors are non-negative for every i, \(H_{\text {P-Sh}, d} \ge 0\).
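
As a numerical companion to this property, the following sketch (assuming \(\Delta \mu _i = 2\,|0.5 - \mu _i|\) from (3) and \(\gamma = 1\) as in these proofs; the function name is illustrative) evaluates \(H_{\text {P-Sh}, d}\) and confirms its non-negativity:

```python
import numpy as np

def p_sh_entropy(mu):
    # H = (1/n) * sum((1 - d_i) * G_i), with relative membership grade
    # d_i = 2*|0.5 - mu_i| and gain G_i = -log(d_i); gamma = 1 assumed.
    d = 2 * np.abs(0.5 - np.asarray(mu, dtype=float))
    d = np.clip(d, 1e-12, 1.0)  # treats mu_i -> 0.5 as d_i -> 0+, since log(0) is undefined
    return float(np.mean((1 - d) * (-np.log(d))))

print(p_sh_entropy([0.0, 0.2, 0.5, 0.8, 1.0]) >= 0)  # True
```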

Proof of Property 2

The proof follows trivially. When \(\mu _i = 0\) or 1 for all \(x_i \in X\), \(\Delta \mu _i = 1\). Therefore, \(\log \left( \Delta \mu _i\right) = 0\), \(\forall i\), and hence, \(H_{\text {P-Sh}, d} = 0\).

Proof of Property 3

As \(\mu _i\) moves towards 0.5, \(\Delta \mu _i\) decreases, reaching its minimum \(\Delta \mu _i = 0\) at \(\mu _i = 0.5\). As \(\Delta \mu _i\) decreases, \({\mathcal {G}}_i\) increases, so the maximum value of \({\mathcal {G}}_i\) is obtained at \(\mu _i = 0.5\). Since \(\log \) is undefined at 0, we consider \(\mu _i \rightarrow 0.5\), or \(\Delta \mu _i \rightarrow 0\). Similarly, the weight \((1 - \Delta \mu _i)\) attains its maximum value at \(\mu _i = 0.5\). Therefore, \(H_{\text {P-Sh}, d}\) is maximum at \(\mu _i \rightarrow 0.5\), \(\forall x_i \in U\).
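
Using the p_sh_entropy sketch above, the entropy of a single-element set indeed grows as the membership grade approaches 0.5:

```python
for m in [0.1, 0.3, 0.45, 0.499]:
    print(m, p_sh_entropy([m]))  # strictly increasing as m -> 0.5
```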

Proof of Property 4

\(H[X,Y] = H[X] + H[Y]\). Let \(\Delta {\mathcal {M}}^X\) and \(\Delta {\mathcal {M}}^Y\) denote the sets of relative membership grades for the fuzzy sets X and Y defined on U. Then, from (6), we can write:

$$\begin{aligned} H[X]&= -\frac{1}{n}\sum _{i = 1}^{n} (1 - \Delta \mu _i^X)\log (\Delta \mu _i^X) \nonumber \\ H[Y]&= -\frac{1}{n}\sum _{i = 1}^{n} (1 - \Delta \mu _i^Y)\log (\Delta \mu _i^Y) \end{aligned}$$
(A.1)

Since \(\Delta \mu _i^X\) and \(\Delta \mu _i^Y\) are independent of each other, we can write:

$$\begin{aligned} H[X, Y]&= -\frac{1}{n}\sum _{i = 1}^{n} \left( (1 - \Delta \mu _i^X)\log (\Delta \mu _i^X) + (1 - \Delta \mu _i^Y)\log (\Delta \mu _i^Y)\right) \nonumber \\&= -\frac{1}{n}\sum _{i = 1}^{n} (1 - \Delta \mu _i^X)\log (\Delta \mu _i^X) - \frac{1}{n}\sum _{i = 1}^{n} (1 - \Delta \mu _i^Y)\log (\Delta \mu _i^Y) \nonumber \\&= H[X] + H[Y] \end{aligned}$$
(A.2)

Proof of Property 5

The proof follows straightforwardly from the proofs of Properties 3 and 4.

Proof of Property 6

From (3): \(\Delta \mu _i = 2\,|0.5 - \mu _i|\). For the complement set, \(\mu _i^{'} = 1 - \mu _i\), \(\forall i\). Hence, for the complement set, \(\Delta \mu _i^{'} = 2\,|0.5 - (1 - \mu _i)| = 2\,|\mu _i - 0.5| = \Delta \mu _i\). Therefore, \(H = H^{'}\).
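
A one-line numeric check of this complement invariance, again with the p_sh_entropy sketch from Property 1:

```python
mu = np.array([0.1, 0.4, 0.55, 0.9])
print(np.isclose(p_sh_entropy(mu), p_sh_entropy(1 - mu)))  # True
```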

Appendix B. Proofs for the properties of P-PP fuzzy entropy

The basic properties are investigated taking \(\gamma = 1\).

Proof of Property 1

We know that the entropy is minimum when \(\mu _i = 0\) or 1, i.e. \(\Delta \mu _i = 1\), \(\forall x_i \in X\). Recalling (9): \(H = \frac{1}{n}\sum _{i = 1}^n (1 - \Delta \mu _i)e^{1 - \Delta \mu _i}\), H is minimum when both \((1 - \Delta \mu _i)\) and \(e^{1 - \Delta \mu _i}\) are at their respective minimum values for all \(x_i \in X\). Since \(\mu _i \in [0, 1]\), we have \(\Delta \mu _i \in [0, 1]\). At the upper bound \(\Delta \mu _i = 1\), i.e. \(\mu _i = 0\) or 1, both factors attain their minima. The minimum H is thus \(\frac{1}{n}\left( n(1 - 1)e^{1 - 1}\right) = 0\).

Proof of Property 2

From (9), we know that H is maximum when \({\mathcal {G}}_i\) and its weight \((1 - \Delta \mu _i)\) are both maximum, for all \(x_i \in X\). Since \({\mathcal {G}}_i = e^{1 - \Delta \mu _i}\), \({\mathcal {G}}_i\) attains its maximum at \(\Delta \mu _i = 0\). The weight \((1 - \Delta \mu _i)\) is also maximum at \(\Delta \mu _i = 0\), i.e. \(\mu _i = 0.5\). Hence, H attains its maximum value when \(\mu _i = 0.5\) (or \(\Delta \mu _i = 0\)), \(\forall x_i \in X\). Substituting \(\Delta \mu _i = 0\), \(\forall x_i \in X\), in (9), we obtain \(H_{max} = e\).
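
Properties 1 and 2 can be checked numerically with the following sketch of (9) (assuming \(\Delta \mu _i = 2\,|0.5 - \mu _i|\) and \(\gamma = 1\); the function name is illustrative):

```python
import numpy as np

def p_pp_entropy(mu):
    # H = (1/n) * sum((1 - d_i) * exp(1 - d_i)), with d_i = 2*|0.5 - mu_i|.
    d = 2 * np.abs(0.5 - np.asarray(mu, dtype=float))
    return float(np.mean((1 - d) * np.exp(1 - d)))

print(p_pp_entropy([0.0, 1.0]))  # 0.0, the minimum (Property 1)
print(p_pp_entropy([0.5, 0.5]))  # e ~ 2.718, the maximum (Property 2)
```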

Proof of Property 3

We know that \(\Delta \mu _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Delta \mu _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. Therefore, both \((1 - \Delta \mu _i)\) and \({\mathcal {G}}_i = e^{(1 - \Delta \mu _i)}\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum value at \(\mu _i = 0.5\). Consequently, \(H = \frac{1}{n}\sum _{i = 1}^n(1 - \Delta \mu _i)(e^{(1 - \Delta \mu _i)})\) follows the same trend.

Proof of Property 4

The proof is the same as that of Property 6 of the P-Sh entropy.

Appendix C. Proofs for the properties of the new fuzzy entropy

Proof of Property 1

If \(\mu _i = 0\) or 1, then \(\Delta \mu _i = 1\). Hence \({\mathcal {G}}_i = e - e^{\Delta \mu _i} = 0\). Therefore, if \(\mu _i = 0\) or 1 for all \(x_i \in U\), then \(H = \frac{1}{n}\sum (1 - 1)\cdot 0 = 0\), which is the minimum value for H (since \(\Delta \mu _i \in [0, 1]\), \(e - e^{\Delta \mu _i}\) is bounded below by 0).

Proof of Property 2

Since \(\mu _i \in [0, 1]\), \(\Delta \mu _i \in [0, 1]\). At \(\mu _i = 0\) or 1, \(\Delta \mu _i = 1\). As \(\mu _i\) moves towards 0.5, \(\Delta \mu _i\) decreases towards 0, and \({\mathcal {G}}_i\) increases towards \(e - 1\), the upper bound for \({\mathcal {G}}_i\). If \(\mu _i = 0.5\), for all \(x_i \in U\), then \(H = \frac{1}{n}\sum _{i = 1}^n(1 - 0)(e - 1) = (e - 1)\), the maximum possible value for H.

Proof of Property 3

We know that \(\Delta \mu _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Delta \mu _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. For a given \(\gamma \), therefore, both \((1 - \Delta \mu _i)\) and \({\mathcal {G}}_i = (e - e^{(\Delta \mu _i)^{\gamma }})\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum value at \(\mu _i = 0.5\). Consequently, \(H = \frac{1}{n}\sum _{i = 1}^n(1 - \Delta \mu _i)(e - e^{(\Delta \mu _i)^{\gamma }})\) varies in the same way.
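
The following sketch of the proposed entropy (assuming \(\Delta \mu _i = 2\,|0.5 - \mu _i|\); the function name is illustrative) reproduces the bounds of Properties 1 and 2 and the trend of Property 3:

```python
import numpy as np

def new_fuzzy_entropy(mu, gamma=1.0):
    # H = (1/n) * sum((1 - d_i) * (e - exp(d_i**gamma))), d_i = 2*|0.5 - mu_i|.
    d = 2 * np.abs(0.5 - np.asarray(mu, dtype=float))
    return float(np.mean((1 - d) * (np.e - np.exp(d ** gamma))))

print(new_fuzzy_entropy([0.0, 1.0]))   # 0.0, the minimum (Property 1)
print(new_fuzzy_entropy([0.5, 0.5]))   # e - 1 ~ 1.718, the maximum (Property 2)
print(new_fuzzy_entropy([0.2]) < new_fuzzy_entropy([0.4]))  # True (Property 3)
```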

Proof of Property 4

We know that \(\Delta \mu _i = 2\,|0.5 - \mu _i|\). Therefore, \(\Delta {\overline{\mu }}_i = 2\,|0.5 - (1 - \mu _i)| = 2\,|\mu _i - 0.5| = 2\,|0.5 - \mu _i| = \Delta \mu _i\). Hence, \(H = {H}^{'}\).
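
And the complement invariance of Property 4, checked with the new_fuzzy_entropy sketch above:

```python
mu = np.array([0.2, 0.6, 0.85])
print(np.isclose(new_fuzzy_entropy(mu), new_fuzzy_entropy(1 - mu)))  # True
```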

About this article

Cite this article

Aggarwal, M. Fuzzy entropy functions based on perceived uncertainty. Knowl Inf Syst 64, 2389–2409 (2022). https://doi.org/10.1007/s10115-022-01700-w
