Abstract
Assessing the near-future performance of an organization is a challenging problem because the underlying uncertainty must be assessed subjectively. To cater to such applications, a novel form of fuzzy entropy is proposed that unifies the concepts of fuzzy entropy and conventional Shannon entropy. The proposed entropy provides a direct and convenient framework for quantifying the uncertainty associated with a firm in the near future. It is also shown that the proposed form is amenable to extending fuzzy entropy functions to the probabilistic-fuzzy domain. The proposed form of fuzzy entropy thus finds special significance in human decision-making problems. A case study illustrates the applicability of the proposed functions in assessing the future performance of a firm.




Data Availability
The authors declare that no data are associated with this article.
Ethics declarations
Conflicts of Interest
The authors declare that there is no conflict of interest.
Appendices
Appendix 1
1.1 Proofs for the Properties of R-Sh Entropy
Proof of \(P_1\)
From (6), we know that \(H_{R-Sh, d} = \frac{1}{n}\sum _{i = 1}^{n} (1 - \Upsilon _i)\mathcal {G}_i\). Since \(\mu _i \in [0, 1]\), we have \(\Upsilon _i \in [0, 1]\) [see (3)]. Hence \((1 - \Upsilon _i) \ge 0\) and \(\mathcal {G}_i = -\log (\Upsilon _i) \ge 0\). Since every summand is a product of two nonnegative factors, \(H_{{R-Sh}, d} \ge 0\).
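As a numeric illustration (not part of the original proof), the following minimal sketch evaluates \(H_{R-Sh, d}\) under the definitions in (3) and (6); the function names are illustrative, and \(\mu_i = 0.5\) is excluded since the logarithm is undefined there.

```python
import math

def upsilon(mu):
    # Relative membership grade from (3): Y_i = 2*|0.5 - mu_i|, in [0, 1].
    return 2 * abs(0.5 - mu)

def h_r_sh(mus):
    # H_{R-Sh,d} = (1/n) * sum_i (1 - Y_i) * (-log Y_i); defined for mu_i != 0.5.
    return sum((1 - upsilon(m)) * -math.log(upsilon(m)) for m in mus) / len(mus)

print(h_r_sh([0.1, 0.3, 0.49, 0.7, 0.95]) >= 0)  # P1: entropy is nonnegative -> True
print(h_r_sh([0.0, 1.0, 1.0]))                   # P2: crisp memberships -> 0.0
```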
Proof of \(P_2\)
The proof follows trivially. When \(\mu _i = 0\) or 1 for all \(x_i \in X\), \(\Upsilon _i = 1\). Therefore \(\log \left( \Upsilon _i\right) = 0\), \(\forall i\), and hence \(H_{{R-Sh}, d} = 0\).
Proof of \(P_3\)
At \(\mu _i = 0.5\), \(\Upsilon _i = 0\), its minimum. As \(\mu _i\) moves towards 0.5, \(\Upsilon _i\) decreases; as \(\Upsilon _i\) decreases, \(\mathcal {G}_i\) increases. Therefore, the maximum value of \(\mathcal {G}_i\) is approached at \(\mu _i = 0.5\). Since \(\log\) is undefined at 0, we consider the limit \(\mu _i \rightarrow 0.5\), i.e., \(\Upsilon _i \rightarrow 0\). Similarly, the weight \((1 - \Upsilon _i)\) attains its maximum value at \(\mu _i = 0.5\). Therefore, \(H_{{R-Sh}, d}\) is maximized as \(\mu _i \rightarrow 0.5\), \(\forall x_i \in U\).
Proof of \(P_4\)
We know that \(\Upsilon _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Upsilon _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. Therefore, both \((1 - \Upsilon _i)\) and \(\mathcal {G}_i = \log (\frac{1}{\Upsilon _i})\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum values at \(\mu _i = 0.5\). Consequently, H monotonically increases in \(\mu _i \in [0, 0.5]\) and monotonically decreases in \(\mu _i \in [0.5, 1]\).
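The monotone behaviour can be checked numerically. This sketch (illustrative, not part of the original proof) evaluates \(H_{R-Sh, d}\) for a one-element set over a grid in (0, 0.5) and confirms the strict increase; by the symmetry of \(\Upsilon_i\) about 0.5, the decrease on (0.5, 1) follows.

```python
import math

def upsilon(mu):
    # Y_i = 2*|0.5 - mu_i|, from (3).
    return 2 * abs(0.5 - mu)

def h_single(mu):
    # H_{R-Sh,d} for a one-element set: (1 - Y) * (-log Y).
    return (1 - upsilon(mu)) * -math.log(upsilon(mu))

grid = [0.05 * k for k in range(1, 10)]   # 0.05, 0.10, ..., 0.45 (mu = 0.5 excluded)
vals = [h_single(m) for m in grid]
print(all(a < b for a, b in zip(vals, vals[1:])))  # strictly increasing on (0, 0.5) -> True
```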
Proof of \(P_5\)
We need to show that \(H(X,Y) = H(X) + H(Y)\). Let \(\mathcal {M}^X\) and \(\mathcal {M}^Y\) denote the sets of relative membership grades for the fuzzy sets X and Y defined on U. Then, from (5), we can write:
Since \(\Upsilon _i^X\) and \(\Upsilon _i^Y\) are independent of each other, we can write:
Proof of \(P_6\)
The proof follows straightforwardly from P3 and P4.
Proof of \(P_7\)
From (3), \(\Upsilon _i = 2|0.5 - \mu _i|\). For the complement set, \(\mu _i^{'} = 1 - \mu _i, \, \forall i\). Hence, for the complement set, \(\Upsilon _i^{'} = 2|0.5 - (1 - \mu _i)| = 2|\mu _i - 0.5| = \Upsilon _i\). Therefore \(H = H^{'}\).
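The complement invariance admits a quick numeric check (illustrative, assuming the definitions in (3) and (6); function names are not from the paper):

```python
import math

def upsilon(mu):
    # Y_i = 2*|0.5 - mu_i|, from (3).
    return 2 * abs(0.5 - mu)

def h_r_sh(mus):
    # H_{R-Sh,d} = (1/n) * sum_i (1 - Y_i) * (-log Y_i).
    return sum((1 - upsilon(m)) * -math.log(upsilon(m)) for m in mus) / len(mus)

mus = [0.2, 0.35, 0.6, 0.9]
complement = [1 - m for m in mus]
print(abs(h_r_sh(mus) - h_r_sh(complement)) < 1e-12)  # P7: H = H' -> True
```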
Appendix 2
1.1 Proofs for the Properties of R-PP fuzzy entropy
Proof of \(P_1\)
Recalling (8), \(H = \frac{1}{n}\sum _{i = 1}^n (1 - \Upsilon _i)e^{1 - \Upsilon _i}\), we see that H is minimized when both \((1 - \Upsilon _i)\) and \(e^{1 - \Upsilon _i}\) are at their respective minima for all \(x_i \in X\). Since \(\mu _i \in [0, 1]\), we have \(\Upsilon _i \in [0, 1]\). At the upper bound \(\Upsilon _i = 1\), i.e., \(\mu _i = 0\) or 1, both factors attain their minima. The minimum value of H is thus \(\frac{1}{n}\left( n(1 - 1)e^{1 - 1}\right) = 0\).
Proof of \(P_2\)
From (8), we know that H is maximum when \(\mathcal {G}_i\) and its weight \((1 - \Upsilon _i)\) are both maximum, for all \(x_i \in X\). Since \(\mathcal {G}_i = e^{1 - \Upsilon _i}\), \(\mathcal {G}_i\) attains its maximum at \(\Upsilon _i = 0\). The weight \((1 - \Upsilon _i)\) is also maximum at \(\Upsilon _i = 0\), i.e., \(\mu _i = 0.5\). Hence, H attains its maximum value when \(\mu _i = 0.5\) (or \(\Upsilon _i = 0\)), \(\forall x_i \in X\). Substituting \(\Upsilon _i = 0\), \(\forall x_i \in X\), in (8), we obtain \(H_{max} = e\).
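Both bounds of the R-PP entropy can be verified numerically. This sketch (illustrative, assuming the form of (8); names are not from the paper) checks that crisp memberships give 0 and that \(\mu_i = 0.5\) gives \(e\):

```python
import math

def upsilon(mu):
    # Y_i = 2*|0.5 - mu_i|, from (3).
    return 2 * abs(0.5 - mu)

def h_r_pp(mus):
    # H = (1/n) * sum_i (1 - Y_i) * e^{1 - Y_i}, from (8).
    return sum((1 - upsilon(m)) * math.exp(1 - upsilon(m)) for m in mus) / len(mus)

print(h_r_pp([0.0, 1.0]))                              # minimum: crisp set -> 0.0
print(abs(h_r_pp([0.5, 0.5, 0.5]) - math.e) < 1e-12)   # maximum: mu_i = 0.5 -> e, so True
```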
Proof of \(P_3\)
We know that \(\Upsilon _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Upsilon _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. Therefore, both \((1 - \Upsilon _i)\) and \(\mathcal {G}_i = e^{(1 - \Upsilon _i)}\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum values at \(\mu _i = 0.5\). Consequently, \(H = \frac{1}{n}\sum _{i = 1}^n(1 - \Upsilon _i)(e^{(1 - \Upsilon _i)})\) has the same trend.
Proof of \(P_4\)
The proof follows directly from that for Property 3.
Proof of \(P_5\)
The proof is the same as that of Property 7 of the R-Sh entropy.
Appendix 3
1.1 Proofs for the Properties of the Proposed Fuzzy Entropy
Proof of \(P_1\)
If \(\mu _i = 0\) or 1, then \(\Upsilon _i = 1\), and hence \(\mathcal {G}_i = e - e^{\Upsilon _i} = 0\). Therefore, if \(\mu _i = 0\) or 1 for all \(x_i \in U\), then \(H = \frac{1}{n}\sum (1 - 1)\cdot 0 = 0\), the minimum value of H (since \(\Upsilon _i \in [0, 1]\), \(e - e^{\Upsilon _i}\) has a lower bound at 0).
Proof of \(P_2\)
Since \(\mu _i \in [0, 1]\), \(\Upsilon _i \in [0, 1]\). At \(\mu _i = 0\) or 1, \(\Upsilon _i = 1\). As \(\mu _i\) moves towards 0.5, \(\Upsilon _i\) decreases towards 0, and \(\mathcal {G}_i\) increases towards \(e - 1\), the upper bound for \(\mathcal {G}_i\). If \(\mu _i = 0.5\), for all \(x_i \in U\), then \(H = \frac{1}{n}\sum _{i = 1}^n(1 - 0)(e - 1) = (e - 1)\), the maximum possible value for H.
Proof of \(P_3\)
We know that \(\Upsilon _i\) decreases as \(\mu _i\) increases in the interval [0, 0.5], and \(\Upsilon _i\) increases as \(\mu _i\) increases in the interval [0.5, 1]. For a given a, therefore, both \((1 - \Upsilon _i)\) and \(\mathcal {G}_i = (e - e^{(\Upsilon _i)^a})\) are monotonically increasing for \(\mu _i \in [0, 0.5]\), monotonically decreasing for \(\mu _i \in [0.5, 1]\), and attain their maximum values at \(\mu _i = 0.5\). Consequently, \(H = \frac{1}{n}\sum _{i = 1}^n(1 - \Upsilon _i)(e - e^{(\Upsilon _i)^a})\) varies in the same way.
Proof of \(P_4\)
The proof follows trivially from that for Property 3.
Proof of \(P_5\)
We know that \(\Upsilon _i = 2|0.5 - \mu _i|\). Therefore, \(\overline{\Upsilon }_i = 2|0.5 - (1 - \mu _i)| = 2|\mu _i - 0.5| = 2|0.5 - \mu _i| = \Upsilon _i\). Hence \(H = {H}^{'}\).
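The properties of the proposed entropy can be checked numerically. This sketch (illustrative; it assumes the form \(H = \frac{1}{n}\sum_i (1 - \Upsilon_i)(e - e^{\Upsilon_i^a})\) used in the appendix, with names not taken from the paper) verifies the minimum, the maximum \(e - 1\), and the complement invariance:

```python
import math

def upsilon(mu):
    # Y_i = 2*|0.5 - mu_i|, from (3).
    return 2 * abs(0.5 - mu)

def h_proposed(mus, a=1.0):
    # H = (1/n) * sum_i (1 - Y_i) * (e - e^{Y_i^a}); 'a' is the exponent from P3.
    return sum((1 - upsilon(m)) * (math.e - math.exp(upsilon(m) ** a)) for m in mus) / len(mus)

print(h_proposed([0.0, 1.0]))                               # P1: crisp set -> 0.0
print(abs(h_proposed([0.5]) - (math.e - 1)) < 1e-12)        # P2: maximum e - 1 -> True
mus = [0.15, 0.4, 0.8]
print(abs(h_proposed(mus) - h_proposed([1 - m for m in mus])) < 1e-12)  # P5: H = H' -> True
```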
Aggarwal, M., Krishankumar, R., Ravichandran, K.S. et al. Assessing Potential of Organizations with Fuzzy Entropy. Oper. Res. Forum 4, 11 (2023). https://doi.org/10.1007/s43069-022-00178-0