Abstract
The paper presents fuzzy entropy functions based on perceived vagueness. The proposed functions build on the principle that different agents may perceive the same membership grade differently. The perceived uncertainty for a membership grade is determined through a gain function. In this light, new variants of the popular fuzzy entropy functions are developed. Inspired by these variants, a novel fuzzy entropy function is also developed. The proposed functions are extended to the probabilistic fuzzy domain. A case study is included to illustrate the applicability of the work.
Notes
It is interesting to note that this is missing from the conventional De Luca and Termini entropy. As a result, the greater the number of elements considered, the greater the entropy becomes. Hence, the entropy increases if we consider a refined definition of a fuzzy set, which is obviously counter-intuitive.
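This effect is easy to check numerically. Below is a minimal sketch (the helper names `s` and `h_lt` are illustrative, not from the paper): the De Luca and Termini entropy, being an unnormalised sum over elements, grows when the same fuzzy set is described over a finer universe.

```python
import math

def s(u):
    # Shannon-type term, with the convention 0*log(0) = 0
    return 0.0 if u in (0.0, 1.0) else -(u * math.log(u) + (1 - u) * math.log(1 - u))

def h_lt(memberships):
    # De Luca-Termini entropy: an unnormalised sum over all elements
    return sum(s(u) for u in memberships)

coarse = [0.2, 0.5, 0.8]
refined = coarse + [0.3, 0.5, 0.7]   # the same set, sampled at more points
print(h_lt(coarse) < h_lt(refined))  # True: refinement alone raises the entropy
```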
References
Aggarwal M (2019) Decision aiding model with entropy-based subjective utility. Inf Sci 501:558–572
Aggarwal M (2020) Bridging the gap between probabilistic and fuzzy entropy. IEEE Trans Fuzzy Syst 28(9):2175–2184
Aggarwal M (2021) Attitude-based entropy function and applications in decision-making. Eng Appl Artif Intell 104:104290
Aggarwal M, Hanmandlu M (2016) Representing uncertainty with information sets. IEEE Trans Fuzzy Syst 24(1):1–15
Allahverdyan AE, Galstyan A, Abbas AE, Struzik ZR (2018) Adaptive decision making via entropy minimization. Int J Approx Reason 103:270–287
De Luca A, Termini S (1972) A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory. Inf Control 20(4):301–312
Dujet C (1983) Separation functions and measures of fuzziness. In: IFAC proceedings volumes. IFAC symposium on fuzzy information, knowledge representation and decision analysis, Marseille, France, vol 16, no 13, pp 91–96
Ebanks BR (1983) On measures of fuzziness and their representations. J Math Anal Appl 94(1):24–37
Emptoz H (1981) Nonprobabilistic entropies and indetermination measures in the setting of fuzzy sets theory. Fuzzy Sets Syst 5(3):307–317
Fan J, Xie W (1999) Distance measure and induced fuzzy entropy. Fuzzy Sets Syst 104(2):305–314
Gao C, Lai Z, Zhou J, Wen J, Wong WK (2019) Granular maximum decision entropy-based monotonic uncertainty measure for attribute reduction. Int J Approx Reason 104:9–24
Gou XJ, Xu ZS, Liao HC (2017) Hesitant fuzzy linguistic entropy and cross-entropy measures and alternative queuing method for multiple criteria decision making. Inf Sci 388–389:225–246
Hirota K, Pedrycz W (1986) Subjective entropy of probabilistic sets and fuzzy cluster analysis. IEEE Trans Syst Man Cybern 16(1):173–179
Kaufmann A (1980) Introduction to the theory of fuzzy subsets: fundamental theoretical elements. Academic Press, New York
Knopfmacher J (1975) On measures of fuzziness. J Math Anal Appl 49(3):529–534
Kosko B (1986) Fuzzy entropy and conditioning. Inf Sci 40(2):165–174
Lee HM, Chen CM, Chen JM, Jou YL (2001) An efficient fuzzy classifier with feature selection based on fuzzy entropy. IEEE Trans Syst Man Cybern Part B (Cybern) 31(3):426–432
Liu XC (1992) Entropy, distance measure and similarity measure of fuzzy sets and their relations. Fuzzy Sets Syst 52(3):305–318
Loo SG (1977) Measures of fuzziness. Cybernetica 20:201–210
Pal NR, Bezdek JC (1994) Measuring fuzzy uncertainty. IEEE Trans Fuzzy Syst 2(2):107–118
Pal NR, Pal SK (1989) Object-background segmentation using new definitions of entropy. IEEE Proc Comput Digit Tech 136(4):284–295
Pal NR, Pal SK (1991) Entropy: a new definition and its applications. IEEE Trans Syst Man Cybern 21(5):1260–1270
Raghu S, Sriraam N, Kumar GP, Hegde AS (2018) A novel approach for real-time recognition of epileptic seizures using minimum variance modified fuzzy entropy. IEEE Trans Biomed Eng 65(11):2612–2621
Romero-Troncoso RJ, Saucedo-Gallaga R, Cabal-Yepez E, Garcia-Perez A, Osornio-Rios RA, Alvarez-Salas R, Miranda-Vidales H, Huber N (2011) FPGA-based online detection of multiple combined faults in induction motors through information entropy and fuzzy inference. IEEE Trans Ind Electron 58(11):5263–5270
Sander W (1989) On measures of fuzziness. Fuzzy Sets Syst 29(1):49–55
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27(4):623–656
Singh P (2020) A neutrosophic-entropy based adaptive thresholding segmentation algorithm: a special application in MR images of Parkinson’s disease. Artif Intell Med 104:101838
Singh P (2020) A neutrosophic-entropy based clustering algorithm (NEBCA) with HSV color system: a special application in segmentation of Parkinson’s disease (PD) MR images. Comput Methods Progr Biomed 189:105317
Singh P, Borah B (2013) High-order fuzzy-neuro expert system for time series forecasting. Knowl Based Syst 46:12–21
Singh P, Dhiman G (2018) Uncertainty representation using fuzzy-entropy approach: special application in remotely sensed high-resolution satellite images (RSHRSIs). Appl Soft Comput 72:121–139
Singh P, Huang Y-P, Lee T-T (2019) A novel ambiguous set theory to represent uncertainty and its application to brain MR image segmentation. In: 2019 IEEE international conference on systems, man and cybernetics (SMC), pp 2460–2465
Trillas E, Riera T (1978) Entropies in finite fuzzy sets. Inf Sci 15(2):159–168
Wang X, Dong C (2009) Improving generalization of fuzzy if–then rules by maximizing fuzzy entropy. IEEE Trans Fuzzy Syst 17(3):556–567
Wang ZX (1984) Fuzzy measures and measures of fuzziness. J Math Anal Appl 104(2):589–601
Weber S (1984) Measures of fuzzy sets and measures of fuzziness. Fuzzy Sets Syst 13(3):247–271
Xie WX, Bedrosian SD (1984) An information measure for fuzzy sets. IEEE Trans Syst Man Cybern SMC-14(1):151–156
Yang M, Nataliani Y (2018) A feature-reduction fuzzy clustering algorithm based on feature-weighted entropy. IEEE Trans Fuzzy Syst 26(2):817–835
Yue C (2017) Entropy-based weights on decision makers in group decision-making setting with hybrid preference representations. Appl Soft Comput 60:737–749
Zadeh LA (1968) Probability measures of fuzzy events. In: Fuzzy sets and applications: selected papers by L. A. Zadeh. Wiley, pp 45–51
Appendices
Appendix A. Proofs for the properties of P-Sh entropy
The basic properties are investigated with \(\gamma \) as 1.
Proof of Property 1
From (7), we know that \(H_{\text {P-Sh}, d} = \frac{1}{n}\sum _{i = 1}^{n} (1 - \Delta \mu _i){\mathcal {G}}_i\). Since \(\mu _i \in [0, 1]\), we have \(\Delta \mu _i \in [0, 1]\) [see (3)]. Hence \((1 - \Delta \mu _i) \ge 0\) and \({\mathcal {G}}_i = -\log (\Delta \mu _i) \ge 0\). As every summand is a product of two non-negative factors, \(H_{\text {P-Sh}, d} \ge 0\).
Proof of Property 2
The proof follows trivially. When \(\mu _i = 0\) or 1 for all \(x_i \in X\), \(\Delta \mu _i = 1\). Therefore, \(\log \left( \Delta \mu _i\right) = 0\), \(\forall i\), and hence, \(H_{\text {P-Sh}, d} = 0\).
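Properties 1 and 2 can be verified numerically. The following is a minimal sketch, with illustrative helper names `delta` and `h_psh` standing for (3) and (7) with \(\gamma = 1\) and the natural logarithm assumed:

```python
import math

def delta(mu):
    # relative membership grade, Eq. (3): 2|0.5 - mu|
    return 2 * abs(0.5 - mu)

def h_psh(mus):
    # P-Sh entropy, Eq. (7) with gamma = 1:
    # average of the weighted gains (1 - d)(-log d)
    return sum((1 - delta(m)) * -math.log(delta(m)) for m in mus) / len(mus)

print(h_psh([0.1, 0.3, 0.7, 0.9]) >= 0)  # Property 1: non-negative
print(h_psh([0.0, 1.0, 1.0]))            # Property 2: a crisp set gives 0.0
```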
Proof of Property 3
At \(\mu _i = 0.5\), \(\Delta \mu _i = 0\), its minimum. As \(\mu _i\) moves towards 0.5, \(\Delta \mu _i\) decreases, and hence \({\mathcal {G}}_i\) increases. As \(\log \) is undefined at 0, we consider \(\mu _i \rightarrow 0.5\), i.e. \(\Delta \mu _i \rightarrow 0\). Similarly, the weight \((1 - \Delta \mu _i)\) attains its maximum value at \(\mu _i = 0.5\). Therefore, \(H_{\text {P-Sh}, d}\) is maximum at \(\mu _i \rightarrow 0.5\), \(\forall x_i \in U\).
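The limiting behaviour can be observed directly: moving a grade towards 0.5 strictly increases its entropy contribution. A sketch with illustrative helpers `delta` and `h_psh` for (3) and (7) at \(\gamma = 1\):

```python
import math

def delta(mu):
    # relative membership grade, Eq. (3)
    return 2 * abs(0.5 - mu)

def h_psh(mus):
    # P-Sh entropy, Eq. (7) with gamma = 1 and natural log
    return sum((1 - delta(m)) * -math.log(delta(m)) for m in mus) / len(mus)

# grades approaching 0.5 from below give strictly increasing entropy
grades = [0.1, 0.2, 0.3, 0.4, 0.45, 0.49]
vals = [h_psh([m]) for m in grades]
print(all(a < b for a, b in zip(vals, vals[1:])))  # True
```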
Proof of Property 4
We need to show that \(H[X,Y] = H[X] + H[Y]\). Let \(\Delta {\mathcal {M}}^X\) and \(\Delta {\mathcal {M}}^Y\) denote the sets of relative membership grades for the fuzzy sets X and Y defined on U. Then, from (6), we can write:
Since \(\Delta \mu _i^X\) and \(\Delta \mu _i^Y\) are independent of each other, we can write:
Proof of Property 5
The proof follows straightforwardly from the proofs of Properties 3 and 4.
Proof of Property 6
From (3): \(\Delta \mu _i = 2|0.5 - \mu _i|\). For the complement set, \(\mu _i^{'} = 1 - \mu _i\), \(\forall i\). Hence, for the complement set, \(\Delta \mu _i^{'} = 2|0.5 - (1 - \mu _i)| = 2|\mu _i - 0.5| = \Delta \mu _i\). Therefore, \(H = H^{'}\).
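The invariance of \(\Delta \mu _i\) under complementation can be confirmed directly; the sketch below uses an illustrative helper `delta` for (3):

```python
def delta(mu):
    # relative membership grade, Eq. (3): 2|0.5 - mu|
    return 2 * abs(0.5 - mu)

# delta is invariant under complementation mu -> 1 - mu, so any entropy
# defined purely in terms of delta satisfies H = H' automatically
grades = [0.0, 0.15, 0.4, 0.5, 0.77, 1.0]
print(all(abs(delta(m) - delta(1 - m)) < 1e-12 for m in grades))  # True
```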
Appendix B. Proofs for the properties of P-PP fuzzy entropy
The basic properties are investigated taking \(\gamma \) as 1.
Proof of Property 1
We know that the entropy is minimum when \(\mu _i = 0\) or 1, i.e. \(\Delta \mu _i = 1\), \(\forall x_i \in X\). Recalling (9): \(H = \frac{1}{n}\sum _{i = 1}^n (1 - \Delta \mu _i)e^{1 - \Delta \mu _i}\), H is minimum when both \((1 - \Delta \mu _i)\) and \(e^{1 - \Delta \mu _i}\) attain their respective minima, for all \(x_i \in X\). Since \(\mu _i \in [0, 1]\), we have \(\Delta \mu _i \in [0, 1]\). At the upper bound \(\Delta \mu _i = 1\), i.e. \(\mu _i = 0\) or 1, both factors attain their minima. The minimum H is thus \(\frac{1}{n}\left( n(1 - 1)e^{1 - 1}\right) = 0\).
Proof of Property 2
From (9), we know that H is maximum when both \({\mathcal {G}}_i\) and its weight \((1 - \Delta \mu _i)\) are maximum, for all \(x_i \in X\). Since \({\mathcal {G}}_i = e^{1 - \Delta \mu _i}\), \({\mathcal {G}}_i\) attains its maximum at \(\Delta \mu _i = 0\). The weight \((1 - \Delta \mu _i)\) is also maximum at \(\Delta \mu _i = 0\), i.e. \(\mu _i = 0.5\). Hence, H attains its maximum value when \(\mu _i = 0.5\) (or \(\Delta \mu _i = 0\)), \(\forall x_i \in X\). Substituting \(\Delta \mu _i = 0\), \(\forall x_i \in X\), in (9), we obtain \(H_{max} = e\).
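Both bounds can be checked numerically; `h_ppp` below is an illustrative implementation of (9) with \(\gamma = 1\), under the assumed helper name `delta` for (3):

```python
import math

def delta(mu):
    # relative membership grade, Eq. (3)
    return 2 * abs(0.5 - mu)

def h_ppp(mus):
    # P-PP entropy, Eq. (9) with gamma = 1
    return sum((1 - delta(m)) * math.exp(1 - delta(m)) for m in mus) / len(mus)

print(h_ppp([0.0, 1.0, 1.0]))                   # minimum: 0.0 for a crisp set
print(abs(h_ppp([0.5, 0.5]) - math.e) < 1e-12)  # maximum: e when all grades are 0.5
```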
Proof of Property 3
We know that \(\Delta \mu _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Delta \mu _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. Therefore, both \((1 - \Delta \mu _i)\) and \({\mathcal {G}}_i = e^{(1 - \Delta \mu _i)}\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum value at \(\mu _i = 0.5\). Consequently, \(H = \frac{1}{n}\sum _{i = 1}^n(1 - \Delta \mu _i)(e^{(1 - \Delta \mu _i)})\) follows the same trend.
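The monotonic trend is easy to confirm element-wise; `term` below is an illustrative name for the per-element P-PP contribution:

```python
import math

def term(mu):
    # per-element P-PP contribution: (1 - d) * e^(1 - d), with d = 2|0.5 - mu|
    d = 2 * abs(0.5 - mu)
    return (1 - d) * math.exp(1 - d)

left = [term(m) for m in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5)]
right = [term(m) for m in (0.5, 0.6, 0.7, 0.8, 0.9, 1.0)]
print(all(a < b for a, b in zip(left, left[1:])))    # increasing on [0, 0.5]
print(all(a > b for a, b in zip(right, right[1:])))  # decreasing on [0.5, 1]
```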
Proof of Property 4
The proof is the same as that of Property 6 of the P-Sh entropy.
Appendix C. Proofs for the properties of the new fuzzy entropy
Proof of Property 1
If \(\mu _i = 0\) or 1, then \(\Delta \mu _i = 1\). Hence \({\mathcal {G}}_i = e - e^{\Delta \mu _i} = 0\). Therefore, if \(\mu _i = 0\) or 1 for all \(x_i \in U\), then \(H = \frac{1}{n}\sum (1 - 1)\cdot 0 = 0\), the minimum value for H (since \(\Delta \mu _i \in [0, 1]\), \(e - e^{\Delta \mu _i}\) is bounded below by 0).
Proof of Property 2
Since \(\mu _i \in [0, 1]\), \(\Delta \mu _i \in [0, 1]\). At \(\mu _i = 0\) or 1, \(\Delta \mu _i = 1\). As \(\mu _i\) moves towards 0.5, \(\Delta \mu _i\) decreases towards 0, and \({\mathcal {G}}_i\) increases towards \(e - 1\), the upper bound for \({\mathcal {G}}_i\). If \(\mu _i = 0.5\), for all \(x_i \in U\), then \(H = \frac{1}{n}\sum _{i = 1}^n(1 - 0)(e - 1) = (e - 1)\), the maximum possible value for H.
Proof of Property 3
We know that \(\Delta \mu _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Delta \mu _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. For a given \(\gamma \), therefore, both \((1 - \Delta \mu _i)\) and \({\mathcal {G}}_i = (e - e^{(\Delta \mu _i)^{\gamma }})\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum value at \(\mu _i = 0.5\). Consequently, \(H = \frac{1}{n}\sum _{i = 1}^n(1 - \Delta \mu _i)(e - e^{(\Delta \mu _i)^{\gamma }})\) varies in the same way.
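The bounds and the trend can be checked numerically; `h_new` below is an illustrative implementation of the proposed entropy, and the check at \(\gamma = 2\) is an assumption beyond \(\gamma = 1\), consistent with the "for a given \(\gamma \)" statement above:

```python
import math

def delta(mu):
    # relative membership grade, Eq. (3)
    return 2 * abs(0.5 - mu)

def h_new(mus, gamma=1.0):
    # proposed entropy: average of (1 - d) * (e - e^(d^gamma))
    return sum((1 - delta(m)) * (math.e - math.exp(delta(m) ** gamma))
               for m in mus) / len(mus)

print(h_new([0.0, 1.0]))                              # minimum: 0.0 for a crisp set
print(abs(h_new([0.5, 0.5]) - (math.e - 1)) < 1e-12)  # maximum: e - 1
vals = [h_new([m], gamma=2.0) for m in (0.1, 0.3, 0.5)]
print(vals[0] < vals[1] < vals[2])                    # same trend for gamma = 2
```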
Proof of Property 4
We know that \(\Delta \mu _i = 2|0.5 - \mu _i|\). Therefore, \(\Delta {\overline{\mu }}_i = 2|0.5 - (1 - \mu _i)| = 2|\mu _i - 0.5| = 2|0.5 - \mu _i| = \Delta \mu _i\). Hence, \(H = {H}^{'}\).
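Complement symmetry can likewise be verified numerically for any \(\gamma \); `h_new` is an illustrative implementation of the proposed entropy:

```python
import math

def delta(mu):
    # relative membership grade, Eq. (3)
    return 2 * abs(0.5 - mu)

def h_new(mus, gamma=1.0):
    # proposed entropy: average of (1 - d) * (e - e^(d^gamma))
    return sum((1 - delta(m)) * (math.e - math.exp(delta(m) ** gamma))
               for m in mus) / len(mus)

mus = [0.1, 0.33, 0.6, 0.95]
comp = [1 - m for m in mus]                             # complement set
print(abs(h_new(mus) - h_new(comp)) < 1e-12)            # H = H'
print(abs(h_new(mus, 2.5) - h_new(comp, 2.5)) < 1e-12)  # holds for other gamma too
```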
Cite this article
Aggarwal, M. Fuzzy entropy functions based on perceived uncertainty. Knowl Inf Syst 64, 2389–2409 (2022). https://doi.org/10.1007/s10115-022-01700-w