Abstract
Entropy is commonly used to measure the uncertainty of uncertain random variables. Within chance theory, it has been defined as logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty of certain uncertain random variables. To address this problem, this paper proposes two new types of entropy for uncertain random variables, sine entropy and partial sine entropy, and studies their properties. Important properties, such as translation invariance and positive linearity, are obtained. In addition, calculation formulas for the sine entropy and partial sine entropy of uncertain random variables are given.
1. Introduction
Entropy is a parameter describing the disorder of objective things. Shannon [1] held that information is the elimination or reduction of uncertainty in people's understanding of things, and he called this degree of uncertainty information entropy. Since then, many scholars have studied Shannon entropy. Using fuzzy set theory, Zadeh [2] introduced fuzzy entropy to quantify the amount of fuzziness. Following that, De Luca and Termini [3] proposed a definition of fuzzy entropy, that is, the uncertainty associated with a fuzzy set. Many later studies addressed the definition and application of fuzzy entropy, such as Bhandary and Pal [4], Pal and Pal [5], and Pal and Bezdek [6]. Furthermore, Li and Liu [7] put forward the definition of the entropy of fuzzy variables.
In 2007, in order to study the uncertainty associated with belief degrees, Liu [8] established uncertainty theory, and in 2009 Liu [9] refined it into a branch of mathematics. The uncertain variable was defined in [10]. After that, Liu [9] gave a definition of the expected value of an uncertain variable, and Liu and Ha [11] gave a formula for calculating the expected value of a function of uncertain variables. Liu [8] proposed formulas based on the uncertainty distribution for calculating variance and moments. Yao [12] and Sheng and Samarjit [13] proposed formulas using the inverse uncertainty distribution for calculating variance and moments. After that, Liu [8] proposed the concept of the logarithmic entropy of uncertain variables. Later, Dai and Chen [14] established a formula to calculate this entropy through the inverse uncertainty distribution. In addition, Chen and Dai [15] studied the maximum entropy principle. After that, Dai [16] proposed quadratic entropy, and Yao et al. [17] proposed the sine entropy of uncertain variables.
We know that, in order to model indeterminate quantities, we have two mathematical tools: probability theory and uncertainty theory. Probability theory is a powerful tool for modeling frequencies through samples, while uncertainty theory is a tool for modeling belief degrees. However, as a system becomes more and more complex, it exhibits both uncertainty and randomness. In 2013, Liu [18] established chance theory for modeling such systems. Liu [19] also proposed and studied the basic concept of the chance measure, which is a monotonically increasing set function satisfying self-duality. Hou [20] proved that the chance measure satisfies subadditivity. Liu [19] also put forward other basic concepts, including the uncertain random variable, its chance distribution, and its digital features. Furthermore, Sheng and Yao [21] provided a formula for calculating the variance. Sheng et al. [22] proposed the concept of logarithmic entropy in 2017. After that, Ahmadzade et al. [23] proposed the concept of quadratic entropy, and Ahmadzade et al. [24] studied partial logarithmic entropy.
Logarithmic entropy may fail to measure uncertainty in some cases. To address this limitation, this paper proposes two new entropies for uncertain random variables, namely sine entropy and partial sine entropy, and discusses their properties. Furthermore, calculation formulas for sine entropy and partial sine entropy are obtained using chance theory. Section 2 reviews some basic concepts of chance theory. Section 3 introduces the concept and basic properties of the sine entropy of uncertain random variables. Section 4 proposes the concept of partial sine entropy and discusses its properties. Finally, Section 5 gives a summary.
2. Preliminaries
In this part, we review some basic concepts of chance theory.
Definition 1
(Liu [18]). Let be an uncertainty space and be a probability space. Then, the product is called a chance space. Let be an uncertain random event. Then, the chance measure of Θ is defined as
The chance measure satisfies: (i) Normality [18]: ; (ii) Duality [18]: for and event ; (iii) Monotonicity [18]: for any real number set ; (iv) Subadditivity [20]: for a sequence of events .
Definition 2
(Liu [18]). A function ξ is called an uncertain random variable if it is from a chance space to the set of real numbers such that is an event in for any Borel set B of real numbers.
Definition 3
(Liu [18]). Let ξ be an uncertain random variable. Then, the function
is a chance distribution of ξ.
Theorem 1
(Liu [19]). Let be probability distributions of independent random variables , and let be uncertainty distributions of independent uncertain variables , respectively. Then, chance distribution of is
where is the uncertainty distribution of for any and is determined by
Definition 4
(Sheng et al. [22]). Let $\Phi$ be the chance distribution of an uncertain random variable ξ. Then, the entropy of ξ is defined by
where $S(t) = -t\ln t - (1-t)\ln(1-t)$.
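For concreteness, the cited works use the integrand $S(t) = -t\ln t - (1-t)\ln(1-t)$ applied to the chance distribution. Assuming that standard form, a minimal numerical sketch (with a hypothetical linear chance distribution L(a, b) as a stand-in) is:

```python
import math

def log_entropy(Phi, lo, hi, n=200_000):
    """Midpoint-rule quadrature of S(Phi(x)) over [lo, hi], where
    S(t) = -t*ln(t) - (1-t)*ln(1-t), with 0*ln(0) taken as 0."""
    def S(t):
        if t <= 0.0 or t >= 1.0:
            return 0.0
        return -t * math.log(t) - (1.0 - t) * math.log(1.0 - t)
    h = (hi - lo) / n
    return sum(S(Phi(lo + (i + 0.5) * h)) for i in range(n)) * h

# Hypothetical linear distribution L(a, b): Phi(x) = (x - a)/(b - a), clipped
a, b = 0.0, 2.0
Phi = lambda x: min(1.0, max(0.0, (x - a) / (b - a)))
H = log_entropy(Phi, a, b)
# For L(a, b) the closed form is (b - a)/2, here 1.0
```

This is a sketch under the stated assumptions, not the paper's own computation; the closed form (b − a)/2 follows from ∫₀¹ S(t) dt = 1/2.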
Definition 5
(Ahmadzade et al. [23]). Let be independent random variables, and be uncertain variables. Then, partial entropy of is defined by
where and is uncertainty distribution of for any real numbers .
3. Sine Entropy of Uncertain Random Variables
Logarithmic entropy may fail to measure the uncertainty of uncertain random variables in some cases. We therefore propose the sine entropy of uncertain random variables as a complementary measure for the cases where logarithmic entropy fails, as shown below.
Definition 6.
Let $\Phi$ be the chance distribution of an uncertain random variable ξ. Then, we define the sine entropy of ξ by
$$S[\xi] = \int_{-\infty}^{+\infty} \sin(\pi \Phi(x))\,\mathrm{d}x.$$
Obviously, the integrand $\sin(\pi t)$ is a symmetric function with $\sin(\pi t) = \sin(\pi(1-t))$; it reaches its unique maximum 1 at $t = 1/2$, and it is strictly increasing on $[0, 1/2]$ and strictly decreasing on $[1/2, 1]$. By Definition 6, we have $S[\xi] \ge 0$. If $\xi = c$ is a special uncertain random variable, that is, a constant, then its chance distribution takes only the values 0 and 1, and $S[\xi] = 0$. Set $\eta = \xi + c$; since the chance distribution of $\eta$ is $\Phi(x - c)$, then $S[\eta] = S[\xi]$.
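As a hedged illustration of Definition 6, the sketch below integrates sin(πΦ(x)) numerically for a hypothetical linear chance distribution L(a, b), for which the closed form is 2(b − a)/π:

```python
import math

def sine_entropy(Phi, lo, hi, n=200_000):
    """Midpoint-rule quadrature of sin(pi * Phi(x)) over [lo, hi].
    The integrand sin(pi*t) is symmetric about t = 1/2, peaks at 1 there,
    increases on [0, 1/2], and decreases on [1/2, 1]."""
    h = (hi - lo) / n
    return sum(math.sin(math.pi * Phi(lo + (i + 0.5) * h)) for i in range(n)) * h

# Hypothetical linear distribution L(a, b) as the chance distribution
a, b = 0.0, 2.0
Phi = lambda x: min(1.0, max(0.0, (x - a) / (b - a)))
S = sine_entropy(Phi, a, b)
# Closed form for L(a, b): 2*(b - a)/pi
```

The choice of a linear distribution is an assumption made purely so the result can be checked against an elementary integral.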
Remark 1.
The sine entropy of an uncertain random variable is invariant under any translation.
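A quick numerical check of Remark 1, assuming the distribution of ξ + c is Φ(x − c): shifting a hypothetical linear distribution by an arbitrary constant leaves the computed sine entropy unchanged.

```python
import math

def sine_entropy(Phi, lo, hi, n=100_000):
    # midpoint-rule quadrature of sin(pi * Phi(x)) over [lo, hi]
    h = (hi - lo) / n
    return sum(math.sin(math.pi * Phi(lo + (i + 0.5) * h)) for i in range(n)) * h

a, b, c = 0.0, 2.0, 5.0                        # c is an arbitrary translation
Phi       = lambda x: min(1.0, max(0.0, (x - a) / (b - a)))
Phi_shift = lambda x: Phi(x - c)               # assumed distribution of xi + c
S0 = sine_entropy(Phi, a, b)
S1 = sine_entropy(Phi_shift, a + c, b + c)
# S0 and S1 agree: translation leaves sine entropy unchanged
```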
Example 1.
Let Ψ be a probability distribution of random variable η, and let Υ be an uncertainty distribution of uncertain variable τ. Then, sine entropy of the sum is
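Assuming, per Theorem 1, that the chance distribution of the sum η + τ is Φ(x) = E_η[Υ(x − η)], the sketch below evaluates Example 1 for a hypothetical two-point random variable η and a linear uncertain variable τ ~ L(0, 1); for this particular choice the mixture is linear on [0, 2], so the closed form 4/π is available as a check:

```python
import math

Upsilon = lambda x: min(1.0, max(0.0, x))       # tau ~ linear L(0, 1) (assumed)
eta_vals, eta_probs = [0.0, 1.0], [0.5, 0.5]    # eta: hypothetical two-point r.v.

def Phi(x):
    # chance distribution of eta + tau: expectation of Upsilon(x - eta)
    return sum(p * Upsilon(x - y) for y, p in zip(eta_vals, eta_probs))

lo, hi, n = 0.0, 2.0, 100_000
h = (hi - lo) / n
S = sum(math.sin(math.pi * Phi(lo + (i + 0.5) * h)) for i in range(n)) * h
# Here Phi(x) = x/2 on [0, 2], so the closed form is 4/pi
```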
Example 2.
Let Ψ be a probability distribution of random variable , and let Υ be an uncertainty distribution of uncertain variable . Then, sine entropy of the product is
Example 3.
Let Ψ be a probability distribution of random variable η, and let Υ be an uncertainty distribution of uncertain variable τ. Then, sine entropy of the minimum is
Example 4.
Let Ψ be a probability distribution of random variable η, and let Υ be an uncertainty distribution of uncertain variable τ. Then, sine entropy of the maximum is
Theorem 2.
Let be an inverse chance distribution of uncertain random variable ξ. Then, sine entropy is
Proof.
By assumption, has an inverse chance distribution , and hence has a chance distribution . We can obtain
then the sine entropy of can be obtained:
We can also obtain the following formula by Fubini theorem:
This completes the proof. □
Remark 2.
Theorem 2 provides a new method to calculate sine entropy of an uncertain random variable when the inverse chance distribution exists.
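Assuming Theorem 2 follows from integrating Definition 6 by parts (the boundary terms vanish because sin(πt) = 0 at t = 0 and t = 1), one candidate inverse form is S[ξ] = −π∫₀¹ Φ⁻¹(α) cos(πα) dα. The sketch below compares this candidate against direct quadrature for a hypothetical linear distribution:

```python
import math

def direct(Phi, lo, hi, n=100_000):
    # Definition 6: integrate sin(pi * Phi(x)) directly
    h = (hi - lo) / n
    return sum(math.sin(math.pi * Phi(lo + (i + 0.5) * h)) for i in range(n)) * h

def via_inverse(Phi_inv, n=100_000):
    # Candidate form: -pi * integral_0^1 Phi_inv(alpha) * cos(pi*alpha) d(alpha)
    h = 1.0 / n
    return -math.pi * sum(
        Phi_inv((i + 0.5) * h) * math.cos(math.pi * (i + 0.5) * h)
        for i in range(n)) * h

# Hypothetical linear distribution L(a, b) with closed form 2*(b - a)/pi
a, b = 1.0, 4.0
Phi = lambda x: min(1.0, max(0.0, (x - a) / (b - a)))
Phi_inv = lambda alpha: a + (b - a) * alpha
S_direct = direct(Phi, a, b)
S_inv = via_inverse(Phi_inv)
# Both routes agree with the closed form 6/pi
```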
Theorem 3.
Let be probability distributions of independent random variables , respectively, and let be independent uncertain variables. Then, the sine entropy of is
where for any real numbers is the uncertainty distribution of .
Proof.
For any real numbers we know that has a chance distribution by Theorem 1,
where is the uncertainty distribution of . By definition of sine entropy, we have
Thus, the theorem is proved. □
Corollary 1.
Let be strictly decreasing with respect to and strictly increasing with respect to . If are continuous, then the sine entropy of ξ is
Proof.
By Theorem 1, we know that the chance distribution of is
Then, we have
Thus, we can obtain
by Theorem 3. This completes the proof. □
Corollary 2.
Let be strictly decreasing with respect to and strictly increasing with respect to . If are regular, then the sine entropy of ξ is
where may be determined by its inverse uncertainty distribution , that is
Proof.
By Theorem 1, for any real numbers we know that the chance distribution of is
where is the uncertainty distribution of . Since is strictly decreasing with respect to and strictly increasing with respect to , it follows that may be determined by its inverse uncertainty distribution when are regular, that is
From Theorem 3, we can obtain
where and may be determined by its inverse uncertainty distribution that is equal to
This completes the proof. □
4. Partial Sine Entropy of Uncertain Random Variables
The concept of the sine entropy of uncertain random variables was proposed above using chance theory. However, we sometimes need to know how much of the sine entropy of an uncertain random variable is attributable to its uncertain variables. To answer this question, we define a new concept, the partial sine entropy of uncertain random variables, which measures this contribution. We propose the concept of partial sine entropy as follows.
Definition 7.
Let be uncertain variables, and let be independent random variables. Then, the partial sine entropy of ξ is
where for any real numbers , is the uncertainty distribution of .
Theorem 4.
Let be probability distributions of independent random variables , respectively, and let be independent uncertain variables. If f is a measurable function, then the partial sine entropy of is
where for any real numbers is the inverse uncertainty distribution of ,⋯, , ,,⋯,.
Proof.
We know that is a differentiable function with . Thus, we have
then the partial sine entropy is
By the Fubini theorem, we have
This completes the proof. □
Example 5.
Let Ψ be a probability distribution of random variable η, let Υ be an uncertainty distribution of uncertain variable τ. Then, the partial sine entropy of the sum is
Proof.
It is obvious that the inverse uncertain distribution of uncertain variable is . By Theorem 4, we have
Thus, the proof is finished. □
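Assuming partial sine entropy averages the sine entropy of the uncertain part over the realizations of the random part, the sketch below evaluates Example 5 for a hypothetical two-point η and a linear τ ~ L(0, 1); each shifted copy of Υ contributes the same entropy, so the result equals the sine entropy of τ alone, 2/π:

```python
import math

def sine_entropy(F, lo, hi, n=100_000):
    # midpoint-rule quadrature of sin(pi * F(x)) over [lo, hi]
    h = (hi - lo) / n
    return sum(math.sin(math.pi * F(lo + (i + 0.5) * h)) for i in range(n)) * h

Upsilon = lambda x: min(1.0, max(0.0, x))       # tau ~ linear L(0, 1) (assumed)
eta_vals, eta_probs = [0.0, 1.0], [0.5, 0.5]    # eta: hypothetical two-point r.v.

# Partial sine entropy of eta + tau: average over eta of the sine
# entropy of the shifted uncertainty distribution Upsilon(x - eta)
PS = sum(p * sine_entropy(lambda x, y=y: Upsilon(x - y), y, y + 1.0)
         for y, p in zip(eta_vals, eta_probs))
# Each shifted copy has the same entropy, so PS equals S[tau] = 2/pi
```

Note the contrast with the full sine entropy of the same sum, which also counts the spread contributed by η; the partial quantity isolates the uncertain component.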
Example 6.
Let Ψ be a probability distribution of random variable η, let Υ be an uncertainty distribution of uncertain variable τ. Then, the partial sine entropy of the product is
Proof.
It is obvious that the inverse uncertain distribution of uncertain variable is . By Theorem 4, we have
Thus, the proof is finished. □
Example 7.
Let and be two uncertainty distributions of uncertain variables and , respectively, and let and be two probability distributions of random variables and , respectively. Set and , then
Proof.
It is obvious that the inverse uncertain distributions of uncertain variables and are , and . By Theorem 4, we can obtain
Thus, the proof is finished. □
Example 8.
Let and be two uncertainty distributions of uncertain variables and , respectively, and let and be two probability distributions of random variables and , respectively. Set and , then
Proof.
It is obvious that the inverse uncertain distributions of uncertain variables and are , and . By Theorem 4, we can obtain
Thus, the proof is finished. □
Theorem 5.
Let be independent uncertain variables, and let be independent random variables. Set , , ⋯, and . If is strictly increasing with respect to and strictly decreasing with respect to , then the partial sine entropy of is
where or are the inverse uncertainty distribution of for any real numbers .
Proof.
It is obvious that the uncertain variable has an inverse uncertainty distribution ; we have
By Theorem 4, we can obtain
This completes the proof. □
Theorem 6.
Let be independent uncertain variables, and let be independent random variables. Set , , ⋯, and . For any real numbers , we have
Proof.
This problem will be proved by three steps.
Step 1: We prove . If , then has an inverse uncertainty distribution , where is the inverse uncertainty distribution of . We have
If , then has an inverse uncertainty distribution , where is the inverse uncertainty distribution of . We have
If , then we immediately have . Thus, we always have
Step 2: We prove
The inverse uncertainty distribution of is
We can obtain
by Theorem 5.
Step 3: By Step 1 and Step 2, for any real numbers , , we can obtain
Thus, the proof is finished. □
Remark 3.
From Theorem 6, we see that the partial sine entropy satisfies positive linearity for any real numbers.
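A numerical check of the positive-linearity property in Theorem 6 for a single uncertain term: scaling a linear uncertain variable by c > 0 scales its sine entropy by c. That c·τ has distribution Υ(x/c) is an assumption consistent with the operational law for strictly increasing functions:

```python
import math

def sine_entropy(F, lo, hi, n=100_000):
    # midpoint-rule quadrature of sin(pi * F(x)) over [lo, hi]
    h = (hi - lo) / n
    return sum(math.sin(math.pi * F(lo + (i + 0.5) * h)) for i in range(n)) * h

Upsilon = lambda x: min(1.0, max(0.0, x))   # tau ~ linear L(0, 1) (assumed)
c = 3.0                                     # hypothetical positive scale factor
scaled = lambda x: Upsilon(x / c)           # assumed distribution of c * tau
S1 = sine_entropy(Upsilon, 0.0, 1.0)
Sc = sine_entropy(scaled, 0.0, c)
# Sc equals c * S1: entropy scales linearly for positive constants
```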
5. Conclusions
Chance theory is a mathematical method for studying phenomena involving both uncertainty and randomness. The entropy of uncertain random variables is important and necessary for measuring uncertainty. In this paper, two new definitions, sine entropy and partial sine entropy, were proposed, and some of their properties were studied. Using the chance distribution or the inverse chance distribution, calculation formulas for the sine entropy and partial sine entropy of uncertain random variables were derived. Partial sine entropy was shown to possess translation invariance and positive linearity.
Author Contributions
Conceptualization, G.S. and R.Z.; methodology, Y.S.; formal analysis, Y.S.; investigation, G.S.; writing—original draft preparation, G.S.; writing—review and editing, Y.S.; funding acquisition, G.S. All authors have read and agreed to the published version of the manuscript.
Funding
This work was supported by National Natural Science Foundation of China (Grants Nos. 12061072 and 62162059).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Acknowledgments
The authors especially thank the editors and anonymous referees for their kind review and helpful comments. In addition, the authors would like to acknowledge the gracious support of this work by the National Natural Science Foundation of China—Joint Key Program of Xinjiang (Grants No. U1703262).
Conflicts of Interest
We declare that we have no relevant or material financial interests that relate to the research described in this paper. The manuscript has neither been published before, nor has it been submitted for consideration of publication in another journal.
References
- Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Zadeh, L.A. Probability measures of fuzzy events. J. Math. Anal. Appl. 1968, 23, 421–427. [Google Scholar] [CrossRef]
- De Luca, A.; Termini, S. A definition of nonprobabilistic entropy in the setting of fuzzy sets theory. Inf. Control. 1972, 20, 301–312. [Google Scholar] [CrossRef]
- Bhandary, D.; Pal, N. Some new information measures for fuzzy sets. Inf. Sci. 1993, 67, 209–228. [Google Scholar] [CrossRef]
- Pal, N.; Pal, K. Object background segmentation using a new definition of entropy. IEEE Proc. Comput. Digit. Tech. 1989, 136, 284–295. [Google Scholar] [CrossRef]
- Pal, N.R.; Bezdek, J. Measuring fuzzy uncertainty. IEEE Trans. Fuzzy Syst. 1994, 2, 107–118. [Google Scholar] [CrossRef]
- Li, P.K.; Liu, B. Entropy of credibility distributions for fuzzy variables. IEEE Trans. Fuzzy Syst. 2008, 16, 123–129. [Google Scholar]
- Liu, B. Uncertainty Theory, 2nd ed.; Springer: Berlin, Germany, 2007. [Google Scholar]
- Liu, B. Some research problems in uncertainty theory. J. Uncertain Syst. 2009, 3, 3–10. [Google Scholar]
- Liu, B. Uncertainty Theory: A Branch of Mathematics for Modeling Human Uncertainty; Springer: Berlin, Germany, 2010. [Google Scholar]
- Liu, Y.H.; Ha, M.H. Expected value of function of uncertain variables. J. Uncertain Syst. 2010, 4, 181–186. [Google Scholar]
- Yao, K. A formula to calculate the variance of uncertain variable. Soft Comput. 2015, 19, 2947–2953. [Google Scholar] [CrossRef]
- Sheng, Y.H.; Samarjit, K. Some results of moments of uncertain variable through inverse uncertainty distribution. Fuzzy Optim. Decis. Mak. 2015, 14, 57–76. [Google Scholar] [CrossRef]
- Dai, W.; Chen, X.W. Entropy of function of uncertain variables. Math. Comput. Model. 2012, 55, 754–760. [Google Scholar] [CrossRef]
- Chen, X.W.; Dai, W. Maximum entropy principle for uncertain variables. Int. J. Fuzzy Syst. 2011, 13, 232–236. [Google Scholar]
- Dai, W. Quadratic entropy of uncertain variables. Soft Comput. 2018, 22, 5699–5706. [Google Scholar] [CrossRef]
- Yao, K.; Gao, J.W.; Dai, W. Sine entropy for uncertain variable. Int. J. Uncertain. Fuzziness-Knowl.-Based Syst. 2013, 21, 743–753. [Google Scholar] [CrossRef]
- Liu, Y.H. Uncertain random variables: A mixture of uncertainty and randomness. Soft Comput. 2013, 17, 625–634. [Google Scholar] [CrossRef]
- Liu, Y.H. Uncertain random programming with applications. Fuzzy Optim. Decis. Mak. 2013, 12, 153–169. [Google Scholar] [CrossRef]
- Hou, Y.C. Subadditivity of Chance Measure. J. Uncertain. Anal. Appl. 2014, 2, 14. [Google Scholar] [CrossRef]
- Sheng, Y.H.; Yao, K. Some formulas of variance of uncertain random variable. J. Uncertain. Anal. Appl. 2014, 2, 12. [Google Scholar] [CrossRef]
- Sheng, Y.H.; Shi, G.; Ralescu, D. Entropy of uncertain random variables with application to minimum spanning tree problem. Int. J. Uncertain. Fuzziness-Knowl.-Based Syst. 2017, 25, 497–514. [Google Scholar] [CrossRef]
- Ahmadzade, H.; Gao, R.; Zarei, H. Partial quadratic entropy of uncertain random variables. J. Uncertain. Syst. 2016, 10, 292–301. [Google Scholar]
- Ahmadzade, H.; Gao, R.; Dehghan, M.; Sheng, Y.H. Partial entropy of uncertain random variables. J. Intell. Fuzzy Syst. 2017, 33, 105–112. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).