Abstract
Fractional calculus (FC) is the area of calculus that generalizes the operations of differentiation and integration. FC operators are non-local and capture the history of dynamical effects present in many natural and artificial phenomena. Entropy is a measure of uncertainty, diversity and randomness often adopted for characterizing complex dynamical systems. Stemming from the synergies between the two areas, this paper reviews the concept of entropy in the framework of FC. Several new entropy definitions have been proposed in recent decades, expanding the scope of applicability of this seminal tool. However, FC is not yet well disseminated in the entropy community. Therefore, new definitions based on FC can generalize both concepts from the theoretical and applied points of view. Time will prove to what extent the new formulations are useful.
1. Introduction
In recent decades, the generalization of the concepts of differentiation [,,,] and entropy [,,,] has received considerable attention. In the first case we may cite the fractional calculus (FC) [,]. FC was introduced by Leibniz in the scope of mathematics by the end of the 17th century, but only recently found application in biology [,], physics [,] and engineering [,], among others [,]. The concept of entropy was introduced by Clausius [] and Boltzmann [] in the field of thermodynamics. Later, entropy was also explored by Shannon [] and Jaynes [] in the context of information theory. Meanwhile, both topics evolved considerably, motivating the formulation of fractional operators [,] and entropy indices [,,,,,,,,,,,,,]. These generalizations extend the application of the two mathematical tools and highlight certain characteristics, such as power-law behavior, non-locality and long range memory [,].
This paper reviews the concept of entropy in the framework of FC. In fact, FC is not yet well disseminated among the entropy community and, therefore, new definitions based on FC may expand the scope of this powerful tool. To the authors’ best knowledge, new entropy definitions are welcomed by the scientific community, somewhat in contrast to what happens with recent fractional operators. Consequently, the manuscript does not intend to assess the pros or the cons of the distinct formulations for some given problem. In a similar line of thought, the analysis of entropy-based indices proposed in the literature for comparing or characterizing some phenomena or probability distributions is outside the focus of this paper. Interested readers can obtain further information on divergence measures [] and mutual information [], as well as on sample [], approximate [], permutation [], spectral [], and fuzzy [] entropies, among others []. Indeed, the main idea of this paper is to review the concept of fractional entropy and to present the current state of its development.
The paper is organized as follows. Section 2 presents the fundamental concepts of FC. Section 3 introduces different entropies with one, two and three parameters. Section 4 reviews the fractional-order entropy formulations. Section 5 compares the different formulations for four well-known distributions. Section 6 assesses the impact of the fractional entropies and analyses their main areas of application. Finally, Section 7 outlines the main conclusions.
2. Fractional-Order Derivatives and Integrals
FC models capture non-local effects, useful in the study of phenomena with long range correlations in time or space.
Let us consider the finite interval , with and , and let , with . The Euler’s gamma function is denoted by and the operator calculates the integer part of the argument. Several definitions of fractional derivatives have been formulated [,,]. A small set is presented in what follows, which includes both historically relevant and widely used definitions:
- The left-side and the right-side Caputo derivatives,
- The left-side and the right-side Grünwald-Letnikov derivatives,
- The Hadamard derivative,
- The left-side and right-side Hilfer derivatives of type ,where and denote the left-side and right-side Riemann-Liouville fractional integrals of order , respectively, defined by:
- The Karcı derivative
- The Liouville, the left-side and the right-side Liouville derivatives,
- The Marchaud, the left-side and the right-side Marchaud derivatives,
- The left-side and the right-side Riemann-Liouville derivatives,
- The Riesz derivative,
- The local Yang derivative,
Often, the Caputo formulation is applied in physics and numerical integration, the Riemann-Liouville in calculus, and the Grünwald-Letnikov in engineering, signal processing and control. These classical definitions are the most frequently used by researchers. In what concerns the mathematical pros and cons of the Karcı and the Yang derivatives, readers may consult [,] and the references therein. In fact, it should be noted that some formulations require careful reflection and are a matter of controversy, since many authors do not consider them as fractional operators [,,]. Nevertheless, the debate about what the term ‘fractional derivative’ really means is still ongoing among contemporary mathematicians [].
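Of the definitions above, the Grünwald-Letnikov one lends itself most directly to numerical evaluation, since it is built from a fractional-order generalization of the backward difference. The following sketch (function and step names are ours, not from the paper) approximates the left-side derivative of order α with lower terminal 0, using the standard recursive computation of the binomial weights:

```python
import math

def gl_weights(alpha, n):
    """Recursive Grunwald-Letnikov weights w_k = (-1)^k C(alpha, k)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def gl_derivative(f, t, alpha, h=1e-3):
    """Left-side Grunwald-Letnikov derivative of order alpha at t,
    lower terminal 0, approximated on a grid of step h (first-order accurate)."""
    n = int(t / h)
    w = gl_weights(alpha, n)
    return sum(w[k] * f(t - k * h) for k in range(n + 1)) / h ** alpha

# Sanity checks: alpha = 0 returns f itself, alpha = 1 the first derivative,
# and D^0.5 of f(t) = t equals t^{1/2}/Gamma(3/2) analytically.
approx = gl_derivative(lambda t: t, 1.0, 0.5)
exact = 1.0 / math.gamma(1.5)
print(approx, exact)
```

For non-integer α every past sample of f contributes a non-zero weight, which is exactly the non-locality and long memory the text refers to; for α = 1 all weights beyond the first two vanish and the ordinary difference quotient is recovered.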
3. The Concept of Entropy
Let us consider a discrete probability distribution , with and . The Shannon entropy, , of distribution is defined as:
and represents the expected value of the information content given by . Therefore, for the uniform probability distribution we have , , and the Shannon entropy takes its maximum value , yielding the Boltzmann formula, up to a multiplicative factor, k, which denotes the Boltzmann constant.
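As a quick numerical check of the statement above, a few lines of Python (our own illustration) confirm that the Shannon entropy of the uniform distribution over n states equals ln(n) and bounds any other distribution of the same size:

```python
import math

def shannon(p):
    """Shannon entropy H = -sum_i p_i ln p_i (natural log; 0 ln 0 := 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 8
uniform = [1.0 / n] * n
print(shannon(uniform), math.log(n))  # uniform case attains the maximum ln(n)
print(shannon([0.7, 0.2, 0.1]))       # a skewed distribution yields less than ln(3)
```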
The Rényi and Tsallis entropies are one-parameter generalizations of (21) given by, respectively:
The entropies and reduce to the Shannon formulation when . The Rényi entropy has an inverse power law equilibrium distribution [], satisfying the zero-th law of thermodynamics []. It is important in statistics and ecology to quantify diversity, in quantum information to measure entanglement, and in computer science for randomness extraction. The Tsallis entropy was proposed in the scope of nonextensive statistical mechanics and has found application in the field of complex dynamics, in diffusion equations [] and Fokker-Planck systems [].
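The reduction of both one-parameter entropies to the Shannon formulation in the limit q → 1 can be verified numerically. The sketch below uses the standard closed forms H_q = ln(Σ p_i^q)/(1−q) for Rényi and S_q = (1 − Σ p_i^q)/(q−1) for Tsallis:

```python
import math

def renyi(p, q):
    """Renyi entropy H_q = ln(sum_i p_i^q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def shannon(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
for q in (0.999, 1.001):  # both formulations approach Shannon as q -> 1
    print(renyi(p, q), tsallis(p, q), shannon(p))
```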
Other one-parameter entropies are the Landsberg-Vedral and Abe formulations [,]:
Expression (24) is related to the Tsallis entropy by , and is often known as normalized Tsallis entropy. Expression (25) is a symmetric modification of the Tsallis entropy, which is invariant to the exchange , and we have .
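These two relations can be illustrated directly, assuming the usual closed forms: the Landsberg-Vedral entropy as the Tsallis entropy divided by Σ p_i^q, and the Abe entropy as −Σ_i (p_i^q − p_i^{1/q})/(q − 1/q), symmetric under q ↔ 1/q:

```python
import math

def tsallis(p, q):
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def landsberg_vedral(p, q):
    """Normalized Tsallis entropy: S_LV = S_T / sum_i p_i^q."""
    return tsallis(p, q) / sum(pi ** q for pi in p)

def abe(p, q):
    """Abe's symmetric entropy, invariant under the exchange q <-> 1/q."""
    return -sum((pi ** q - pi ** (1.0 / q)) / (q - 1.0 / q) for pi in p)

p = [0.5, 0.3, 0.2]
print(abe(p, 2.0), abe(p, 0.5))  # identical, by the q <-> 1/q symmetry
print(landsberg_vedral(p, 2.0))
```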
The two-parameter Sharma-Mittal entropy [] is a generalization of the Shannon, Tsallis and Rényi entropies, and is defined as follows:
The Sharma-Mittal entropy reduces to the Rényi, Tsallis and Shannon’s formulations for the limits , and , respectively.
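The three limiting cases can be checked numerically, assuming the standard closed form S(q, r) = [(Σ p_i^q)^{(1−r)/(1−q)} − 1]/(1 − r) of the Sharma-Mittal entropy:

```python
import math

def sharma_mittal(p, q, r):
    """Sharma-Mittal entropy (q != 1, r != 1):
    S = ((sum_i p_i^q)^((1-r)/(1-q)) - 1) / (1 - r)."""
    s = sum(pi ** q for pi in p)
    return (s ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

def renyi(p, q):
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def tsallis(p, q):
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
q = 2.0
print(sharma_mittal(p, q, 1.0001), renyi(p, q))      # r -> 1 gives Renyi
print(sharma_mittal(p, q, q + 1e-4), tsallis(p, q))  # r -> q gives Tsallis
```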
Examples of three-parameter formulations are the gamma and the Kaniadakis entropies, and , respectively. The gamma entropy is given by []:
where e denotes the Napier constant, represents the generalized incomplete gamma function, defined by:
and is the upper incomplete gamma function.
The entropy follows the first three Khinchin axioms [,,] within the parameter regions defined by (29) and (30):
Different combinations of the parameters yield distinct entropy formulations []. For example, if we set , then we recover the Tsallis entropy, while for , , we obtain the Shannon entropy.
The Kaniadakis entropy belongs to a class of trace-form entropies given by []:
where is a strictly increasing function defined for positive values of the argument, noting that . The function can be viewed as a generalization of the ordinary logarithm [] that, in the three-parameter case, yields:
Therefore, the Kaniadakis entropy, , can be expressed as:
The entropy is Lesche [] and thermodynamically [] stable for . Distinct combinations of the parameters yield several entropy formulations []. For example, if we set or , , then expression (33) yields the Tsallis and the Shannon entropies, respectively.
Other entropies can be found in the literature [,], but a thorough review of all proposed formulations is out of the scope of this paper.
4. Fractional Generalizations of Entropy
It was noted [] that the Shannon and Tsallis entropies have the same generating function and that the difference in the Formulas (21) and (23) is just due to the adopted differentiation operator. In fact, using the standard first-order differentiation, , we obtain the Shannon entropy:
while adopting the Jackson q-derivative [], , , yields the Tsallis entropy []:
Other expressions for entropy can be obtained by adopting additional differentiation operators.
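The generating-function observation above can be made concrete. With G(x) = Σ_i p_i^x, the ordinary derivative of −G at x = 1 is the Shannon entropy, while the Jackson q-derivative D_q f(x) = (f(qx) − f(x))/((q − 1)x) applied at x = 1 gives exactly the Tsallis entropy (a small illustration of ours):

```python
import math

def G(p, x):
    """Generating function G(x) = sum_i p_i^x; entropies arise as -DG at x = 1."""
    return sum(pi ** x for pi in p)

def jackson_dq(f, x, q):
    """Jackson q-derivative D_q f(x) = (f(qx) - f(x)) / ((q - 1) x)."""
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

p = [0.5, 0.3, 0.2]
q = 2.0
# q-derivative at x = 1: (G(q) - 1)/(q - 1); its negative is the Tsallis entropy
tsallis = -jackson_dq(lambda x: G(p, x), 1.0, q)
print(tsallis, (1.0 - sum(pi ** q for pi in p)) / (q - 1.0))

# ordinary (forward-difference) derivative at x = 1 recovers the Shannon entropy
h = 1e-7
shannon = -(G(p, 1.0 + h) - G(p, 1.0)) / h
print(shannon, -sum(pi * math.log(pi) for pi in p))
```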
In 2001, Akimoto and Suzuki [] proposed the one-parameter fractional entropy, , given by:
where is the Riemann-Liouville operator (17), with .
The expressions (36) and (17) yield:
where denotes the confluent hypergeometric function of the first kind []:
It can be shown that [] has the concavity and non-extensivity properties. In the limit , it obeys positivity and gives the Shannon entropy, .
In 2009, Ubriaco introduced a one-parameter fractional entropy, , given by []:
where is the Riemann-Liouville left-side derivative (17) with .
Therefore, we obtain:
Performing the integration and taking the limit , it yields:
The Ubriaco entropy (41) is thermodynamically stable and obeys the same properties as the Shannon entropy, with the exception of additivity. When , we recover the Shannon entropy.
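The Ubriaco entropy is commonly written in the closed form S_q = Σ_i p_i (−ln p_i)^q; assuming that form, the limit behavior stated above is easy to verify numerically:

```python
import math

def ubriaco(p, q):
    """Ubriaco's fractional entropy in its commonly quoted closed form:
    S_q = sum_i p_i (-ln p_i)^q."""
    return sum(pi * (-math.log(pi)) ** q for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
print(ubriaco(p, 1.0))                 # q = 1 recovers the Shannon entropy
print(ubriaco(p, 0.5), ubriaco(p, 0.9))  # fractional orders weight information non-linearly
```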
In 2012, Yu et al. [] formulated a one-parameter fractional entropy by means of the simple expression:
where the operator is the left-side Riemann-Liouville integral (8), with . Expression (42) obeys the concavity property and is an extension and generalization of the Shannon entropy.
Another fractional entropy was derived in 2014 by Radhakrishnan et al. [], being given by:
The two-parameter expression (43) was inspired by (41) and by the entropy (44) derived by Wang in the context of incomplete information theory []:
where and denotes the q-expectation that characterizes incomplete normalization.
The entropy (43) is considered a fractional entropy in a fractal phase space in which the parameters q and are associated with fractality and fractionality, respectively. In the limit, when (i) Equation (43) reduces to (41), (ii) recovers , and (iii) expression (43) yields the standard Shannon formula (21).
In 2014, Machado followed a different line of thought [], thinking of Shannon information as a function of order zero lying between the integer-order cases and . In the perspective of FC, this observation motivated the formulation of information and entropy of order as []:
where denotes a fractional derivative operator, and represent the digamma function.
The one-parameter fractional entropy (46) fails to obey some of the Khinchin axioms, except for the case that leads to the Shannon entropy []. This behavior is in line with what occurs in FC, where fractional derivatives fail to obey some of the properties of integer-order operators [].
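Machado's entropy of order α is commonly quoted in the closed form S_α = Σ_i p_i^{1−α} [−ln p_i + ψ(1) − ψ(1−α)] / Γ(α+1), involving the gamma and digamma functions. Assuming that form, the sketch below (with a simple numerical stand-in for the digamma, since the Python standard library lacks one) verifies that α = 0 recovers the Shannon entropy:

```python
import math

def digamma(x, h=1e-6):
    """Numerical digamma via a central difference of lgamma (adequate here)."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2.0 * h)

def machado_entropy(p, alpha):
    """Fractional entropy of order alpha, in its commonly quoted closed form:
    S_a = sum_i p_i^(1-a) [ -ln p_i + psi(1) - psi(1-a) ] / Gamma(a+1), a < 1."""
    g = math.gamma(alpha + 1.0)
    c = digamma(1.0) - digamma(1.0 - alpha)
    return sum(pi ** (1.0 - alpha) * (-math.log(pi) + c) / g for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
print(machado_entropy(p, 0.0))  # alpha = 0 recovers the Shannon entropy
print(machado_entropy(p, 0.25), machado_entropy(p, -0.25))  # fractional orders
```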
Expression (46) was generalized by Jalab et al. [] in the framework of local FC []. Adopting (20), the following expression was proposed:
Equation (47) decreases from 1 to , . Therefore, we have:
In 2016, Karcı [] proposed the fractional derivative (10), based on the concept of indefinite limit and l’Hôpital’s rule. Adopting , and substituting (10) into (34), he derived the following expression for fractional entropy []:
In 2019, Ferreira and Machado [] presented a new formula for the entropy based on the work of Abe [] and Ubriaco []. They start from the definition of the left-side Liouville fractional derivative of a function f with respect to another function g, with , given by:
Therefore, we have:
which applying , for , results in:
In 2019, Machado and Lopes [] proposed two fractional formulations of the Rényi entropy, and . Their derivation adopts a general averaging operator, instead of the linear one that is assumed for the Shannon entropy (21). Let us consider a monotonic function with inverse . Therefore, for a set of real values , , with probabilities , we can define a general mean [] associated with as:
Applying (56) to the Shannon entropy (21) we obtain:
where is a Kolmogorov–Nagumo invertible function []. If the postulate of additivity for independent events is considered in (56), then only two functions are possible, consisting of and , with . For we get the ordinary mean and we verify that . For we have the expression:
which gives the Rényi entropy:
In the limit, when , both and yield (22).
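The Kolmogorov–Nagumo construction above can be illustrated numerically. Taking the information content I_i = −ln p_i, the identity function yields the ordinary mean (Shannon entropy), while φ(x) = e^{(1−q)x} yields the Rényi entropy (a sketch of ours, with φ written as `phi`):

```python
import math

def kn_mean(values, probs, phi, phi_inv):
    """Kolmogorov-Nagumo generalized mean: phi^{-1}( sum_i p_i phi(x_i) )."""
    return phi_inv(sum(p * phi(x) for x, p in zip(values, probs)))

p = [0.5, 0.3, 0.2]
info = [-math.log(pi) for pi in p]  # information content of each outcome

# identity phi: the ordinary mean of the information is the Shannon entropy
shannon = kn_mean(info, p, lambda x: x, lambda y: y)

# exponential phi(x) = exp((1-q) x): the KN mean is the Renyi entropy
q = 2.0
phi = lambda x: math.exp((1.0 - q) * x)
phi_inv = lambda y: math.log(y) / (1.0 - q)
renyi = kn_mean(info, p, phi, phi_inv)
print(shannon, renyi, math.log(sum(pi ** q for pi in p)) / (1.0 - q))
```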
5. Comparison of the Fractional-Order Entropies
In this section we use the fractional entropy formulas to compute the entropy of both abstract and real-world data series.
5.1. Fractional-Order Entropy of Some Probability Distributions
We calculate the entropy of four well-known probability distributions, namely those of Poisson, Gaussian, Lévy and Weibull. We consider these cases just with the purpose of illustrating the behavior of the different formulations. Obviously other cases could be considered, but we limit the number for the sake of parsimony. Firstly, we present the results obtained with the one-parameter entropies , , , , , and . Then, we consider the two-parameter formulations , and . Table 1 summarizes the constants adopted for the distributions and the intervals of variation of the entropy parameters.
Table 1.
The constants adopted for the probability distributions and the intervals of variation of the entropy parameters.
Figure 1 depicts the values of , , , , , and versus . We verify that in the limits, either or , the values of the Shannon entropy are calculated as 2.087, 5.866, 4.953 and 5.309, respectively. Moreover, it can be seen that and are very close to each other, does not obey positivity, diverges at small values of , has a maximum at values of close to 0.6 and diverges as .
Figure 1.
The values of , , , , , and versus for the (a) Poisson, (b) Gaussian, (c) Lévy and (d) Weibull distributions.
Figure 2 portrays the values of , and versus and . We verify that in the domain considered the entropies vary slightly and do not diverge.
Figure 2.
The values of , and versus and for the (a–c) Poisson, (d–f) Gaussian, (g–i) Lévy and (j–l) Weibull distributions.
5.2. Fractional-Order Entropy of Real-World Data
We calculate the entropy of a real-world time series, namely the Dow Jones Industrial Average (DJIA) financial index. The DJIA raw data are available at the Yahoo Finance website (https://finance.yahoo.com/). Herein, we consider the stock closing values in the time period from 1 January 1987 up to 24 November 2018, with one-day sampling interval. Occasional missing values, as well as values corresponding to closing days, are estimated using linear interpolation. The processed DJIA time series, , points, is used to construct a histogram of relative frequencies, , with bins equally spaced and non-overlapping, for estimating the probability distribution of x.
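The preprocessing described above (linear interpolation of interior gaps, then a histogram of relative frequencies over equally spaced, non-overlapping bins) can be sketched as follows; the toy series and all names are illustrative, not the DJIA data:

```python
def interpolate_missing(x):
    """Fill None entries by linear interpolation between the nearest known
    neighbours (assumes the gaps are interior, as in closing-day data)."""
    x = list(x)
    known = [i for i, v in enumerate(x) if v is not None]
    for i, v in enumerate(x):
        if v is None:
            lo = max(k for k in known if k < i)
            hi = min(k for k in known if k > i)
            t = (i - lo) / (hi - lo)
            x[i] = x[lo] + t * (x[hi] - x[lo])
    return x

def relative_histogram(x, n_bins):
    """Relative frequencies over n_bins equally spaced, non-overlapping bins."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for v in x:
        k = min(int((v - lo) / width), n_bins - 1)  # clamp the maximum into the last bin
        counts[k] += 1
    return [c / len(x) for c in counts]

series = interpolate_missing([10.0, None, 14.0, 15.0, None, None, 21.0, 18.0])
p = relative_histogram(series, 4)
print(series)
print(p, sum(p))  # the estimated probabilities sum to 1
```

The resulting vector p can then be fed to any of the entropy formulas of the previous sections.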
Figure 3a depicts the values of , , , , , and versus . We verify that, as shown in Section 5.1, and yield similar results, diverges for small values of , and has a maximum at values of close to 0.6, diverging when .
Figure 3.
The entropy of the DJIA stock index for daily closing values in the time period from 1 January 1987 up to 24 November 2018, with one-day sampling interval: (a) , , , , , and versus ; (b–d) , and versus and .
Figure 3b–d show the values of , and versus and , yielding results of the same type as before.
6. Impact and Applications of the Fractional-Order Entropies
To assess the impact of the fractional-order entropies on the scientific community, we consider the number of citations received by the nine papers that first proposed them. Table 2 summarizes the results obtained from the database Scopus on 7 November 2020 (www.scopus.com). We verify that those nine papers were cited 218 times by 170 distinct papers, and that the expressions proposed by Ubriaco and Machado received the most attention.
To unravel the main areas of application of the fractional entropies, we use the VOSviewer (https://www.vosviewer.com/), which allows the construction and visualization of bibliometric networks []. The bibliometric data of the 170 papers that cite the nine papers presenting the fractional-order entropies were collected from Scopus for constructing Table 2, and are the input information to the VOSviewer. The co-occurrence of the authors’ keywords in the 170 papers is analyzed, with the minimum co-occurrence of each keyword set to 3. Figure 4 depicts the generated map. We verify the emergence of six clusters, . At the top, the light-blue cluster, , includes the fields of finance and financial time series analysis, while the light-green one, , encompasses a variety of areas, such as solvents, fractals, commerce, and stochastic systems, tightly connected to some entropy-based complexity measures. On the right, the dark-green cluster, , includes the areas of fault detection and image processing. At the bottom of the map, the red cluster, , highlights the fields of chromosome and DNA analysis, while the dark-blue one, , emphasizes clustering and visualization techniques, such as multidimensional scaling and hierarchical clustering. On the left, the magenta cluster, , includes keywords not related to applications.
Table 2.
Citations received by the nine papers that proposed the fractional-order entropies, according to the database Scopus on 7 November 2020.
Figure 4.
The map of co-occurrence of the authors’ keywords in the 170 papers extracted from Scopus for constructing Table 2. The minimum value of co-occurrence of each keyword is 3. The clusters are represented by .
In summary, we conclude that the fractional entropies were applied to a considerable number of distinct scientific areas and that we may foresee a promising future for their development by exploring the synergies of the two mathematical tools. The prevalence of some proposals, from the point of view of citations, may be due to the time elapsed since their formulation. Indeed, more recent formulations have not yet had sufficient time to disseminate in the community. Another reason may have to do with the type and audience of the journals where they were published. Nonetheless, a full bibliometric analysis is not the leitmotif of the present paper.
7. Conclusions
This paper reviewed the concept of entropy in the framework of FC. To the best of the authors’ knowledge, the fractional entropies proposed so far were included in this review. The different formulations result from the adopted (i) fractional-order operator or (ii) generating function. In general, such entropies are non-extensive and converge to the classical Shannon entropy for certain values of their parameters. The fractional entropies have found applications in the area of complex systems, where the classical formulations revealed some limitations. FC promises a bright future for further developments of entropy and its applications.
Author Contributions
A.M.L. and J.A.T.M. conceived, designed and performed the experiments, analyzed the data and wrote the paper. Both authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Oldham, K.; Spanier, J. The Fractional Calculus: Theory and Application of Differentiation and Integration to Arbitrary Order; Academic Press: New York, NY, USA, 1974.
- Samko, S.; Kilbas, A.; Marichev, O. Fractional Integrals and Derivatives: Theory and Applications; Gordon and Breach Science Publishers: Amsterdam, The Netherlands, 1993.
- Miller, K.; Ross, B. An Introduction to the Fractional Calculus and Fractional Differential Equations; John Wiley and Sons: New York, NY, USA, 1993.
- Kilbas, A.; Srivastava, H.; Trujillo, J. Theory and Applications of Fractional Differential Equations; North-Holland Mathematics Studies; Elsevier: Amsterdam, The Netherlands, 2006; Volume 204.
- Plastino, A.; Plastino, A.R. Tsallis entropy and Jaynes’ Information Theory formalism. Braz. J. Phys. 1999, 29, 50–60.
- Li, X.; Essex, C.; Davison, M.; Hoffmann, K.H.; Schulzky, C. Fractional Diffusion, Irreversibility and Entropy. J. Non-Equilib. Thermodyn. 2003, 28, 279–291.
- Mathai, A.; Haubold, H. Pathway model, superstatistics, Tsallis statistics, and a generalized measure of entropy. Phys. A Stat. Mech. Appl. 2007, 375, 110–122.
- Anastasiadis, A. Special Issue: Tsallis Entropy. Entropy 2012, 14, 174–176.
- Tenreiro Machado, J.A.; Kiryakova, V. Recent history of the fractional calculus: Data and statistics. In Handbook of Fractional Calculus with Applications: Basic Theory; Kochubei, A., Luchko, Y., Eds.; De Gruyter: Berlin, Germany, 2019; Volume 1, pp. 1–21.
- Machado, J.T.; Galhano, A.M.; Trujillo, J.J. On development of fractional calculus during the last fifty years. Scientometrics 2014, 98, 577–582.
- Ionescu, C. The Human Respiratory System: An Analysis of the Interplay between Anatomy, Structure, Breathing and Fractal Dynamics; Series in BioEngineering; Springer: London, UK, 2013.
- Lopes, A.M.; Machado, J.T. Fractional order models of leaves. J. Vib. Control. 2014, 20, 998–1008.
- Hilfer, R. Application of Fractional Calculus in Physics; World Scientific: Singapore, 2000.
- Tarasov, V. Fractional Dynamics: Applications of Fractional Calculus to Dynamics of Particles, Fields and Media; Springer: New York, NY, USA, 2010.
- Parsa, B.; Dabiri, A.; Machado, J.A.T. Application of Variable order Fractional Calculus in Solid Mechanics. In Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A; Baleanu, D., Lopes, A.M., Eds.; De Gruyter: Berlin, Germany, 2019; Volume 7, pp. 207–224.
- Lopes, A.M.; Machado, J.A.T. Fractional-order modeling of electro-impedance spectroscopy information. In Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A; Baleanu, D., Lopes, A.M., Eds.; De Gruyter: Berlin, Germany, 2019; Volume 7, pp. 21–41.
- Valério, D.; Ortigueira, M.; Machado, J.T.; Lopes, A.M. Continuous-time fractional linear systems: Steady-state behaviour. In Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A; Petráš, I., Ed.; De Gruyter: Berlin, Germany, 2019; Volume 6, pp. 149–174.
- Tarasov, V.E. On history of mathematical economics: Application of fractional calculus. Mathematics 2019, 7, 509.
- Clausius, R. The Mechanical Theory of Heat: With Its Applications to the Steam-Engine and to the Physical Properties of Bodies; Van Voorst, J., Ed.; Creative Media Partners: Sacramento, CA, USA, 1867.
- Boltzmann, L. Vorlesungen über die Principe der Mechanik; Barth, J.A., Ed.; Nabu Press: Charleston, SC, USA, 1897; Volume 1.
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
- Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620.
- Ortigueira, M.D.; Machado, J.T. What is a fractional derivative? J. Comput. Phys. 2015, 293, 4–13.
- Valério, D.; Trujillo, J.J.; Rivero, M.; Machado, J.T.; Baleanu, D. Fractional calculus: A survey of useful formulas. Eur. Phys. J. Spec. Top. 2013, 222, 1827–1846.
- Lopes, A.M.; Tenreiro Machado, J.; Galhano, A.M. Multidimensional Scaling Visualization Using Parametric Entropy. Int. J. Bifurc. Chaos 2015, 25, 1540017.
- Landsberg, P.T.; Vedral, V. Distributions and channel capacities in generalized statistical mechanics. Phys. Lett. A 1998, 247, 211–217.
- Beck, C. Generalised information and entropy measures in physics. Contemp. Phys. 2009, 50, 495–510.
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
- Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
- Naudts, J. Generalized thermostatistics based on deformed exponential and logarithmic functions. Phys. A Stat. Mech. Appl. 2004, 340, 32–40.
- Abe, S.; Beck, C.; Cohen, E.G. Superstatistics, thermodynamics, and fluctuations. Phys. Rev. E 2007, 76, 031102.
- Sharma, B.D.; Mittal, D.P. New nonadditive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28–40.
- Wada, T.; Suyari, H. A two-parameter generalization of Shannon–Khinchin axioms and the uniqueness theorem. Phys. Lett. A 2007, 368, 199–205.
- Bhatia, P. On certainty and generalized information measures. Int. J. Contemp. Math. Sci. 2010, 5, 1035–1043.
- Asgarani, S. A set of new three-parameter entropies in terms of a generalized incomplete Gamma function. Phys. A Stat. Mech. Appl. 2013, 392, 1972–1976.
- Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. EPL (Europhys. Lett.) 2011, 93, 20006.
- Sharma, B.D.; Taneja, I.J. Entropy of type (α, β) and other generalized measures in information theory. Metrika 1975, 22, 205–215.
- Kaniadakis, G. Maximum entropy principle and power-law tailed distributions. Eur. Phys. J. B-Condens. Matter Complex Syst. 2009, 70, 3–13.
- Tarasov, V.E. Lattice model with power-law spatial dispersion for fractional elasticity. Cent. Eur. J. Phys. 2013, 11, 1580–1588.
- Nigmatullin, R.; Baleanu, D. New relationships connecting a class of fractal objects and fractional integrals in space. Fract. Calc. Appl. Anal. 2013, 16, 911–936.
- Lin, J. Divergence measures based on the Shannon entropy. IEEE Trans. Inf. Theory 1991, 37, 145–151.
- Cover, T.M.; Thomas, J.A. Entropy, relative entropy and mutual information. Elem. Inf. Theory 1991, 2, 1–55.
- Ebrahimi, N.; Pflughoeft, K.; Soofi, E.S. Two measures of sample entropy. Stat. Probab. Lett. 1994, 20, 225–234.
- Pincus, S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA 1991, 88, 2297–2301.
- Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2002, 88, 174102.
- Pan, Y.; Chen, J.; Li, X. Spectral entropy: A complementary index for rolling element bearing performance degradation assessment. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2009, 223, 1223–1231.
- Fan, J.L.; Ma, Y.L. Some new fuzzy entropy formulas. Fuzzy Sets Syst. 2002, 128, 277–284.
- Rosso, O.A.; Blanco, S.; Yordanova, J.; Kolev, V.; Figliola, A.; Schürmann, M.; Başar, E. Wavelet entropy: A new tool for analysis of short duration brain electrical signals. J. Neurosci. Methods 2001, 105, 65–75.
- De Oliveira, E.C.; Tenreiro Machado, J.A. A review of definitions for fractional derivatives and integral. Math. Probl. Eng. 2014, 2014, 238459.
- Sousa, J.V.D.C.; de Oliveira, E.C. On the ψ-Hilfer fractional derivative. Commun. Nonlinear Sci. Numer. Simul. 2018, 60, 72–91.
- Katugampola, U.N. Correction to “What is a fractional derivative?” by Ortigueira and Machado [Journal of Computational Physics, Volume 293, 15 July 2015, Pages 4–13. Special issue on Fractional PDEs]. J. Comput. Phys. 2016, 321, 1255–1257.
- Tarasov, V.E. No nonlocality. No fractional derivative. Commun. Nonlinear Sci. Numer. Simul. 2018, 62, 157–163.
- Abdelhakim, A.A.; Machado, J.A.T. A critical analysis of the conformable derivative. Nonlinear Dyn. 2019, 95, 3063–3073.
- Lenzi, E.; Mendes, R.; Da Silva, L. Statistical mechanics based on Rényi entropy. Phys. A Stat. Mech. Appl. 2000, 280, 337–345.
- Parvan, A.; Biró, T. Extensive Rényi statistics from non-extensive entropy. Phys. Lett. A 2005, 340, 375–387.
- Plastino, A.; Casas, M.; Plastino, A. A nonextensive maximum entropy approach to a family of nonlinear reaction–diffusion equations. Phys. A Stat. Mech. Appl. 2000, 280, 289–303.
- Frank, T.; Daffertshofer, A. H-theorem for nonlinear Fokker–Planck equations related to generalized thermostatistics. Phys. A Stat. Mech. Appl. 2001, 295, 455–474.
- Abe, S. A note on the q-deformation-theoretic aspect of the generalized entropies in nonextensive physics. Phys. Lett. A 1997, 224, 326–330.
- Khinchin, A.I. Mathematical Foundations of Information Theory; Dover: New York, NY, USA, 1957.
- Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1963.
- Lesche, B. Instabilities of Rényi entropies. J. Stat. Phys. 1982, 27, 419–422.
- Gell-Mann, M.; Tsallis, C. Nonextensive Entropy: Interdisciplinary Applications; Oxford University Press: Oxford, UK, 2004.
- Amigó, J.M.; Balogh, S.G.; Hernández, S. A brief review of generalized entropies. Entropy 2018, 20, 813.
- Namdari, A.; Li, Z. A review of entropy measures for uncertainty quantification of stochastic processes. Adv. Mech. Eng. 2019, 11, 1687814019857350.
- Abe, S. Nonextensive statistical mechanics of q-bosons based on the q-deformed entropy. Phys. Lett. A 1998, 244, 229–236.
- Jackson, F.H. On q-functions and a certain difference operator. Earth Environ. Sci. Trans. R. Soc. Edinb. 1909, 46, 253–281.
- Akimoto, M.; Suzuki, A. Proposition of a New Class of Entropy. J. Korean Phys. Soc. 2001, 38, 460–463.
- Abramowitz, M.; Stegun, I.A. (Eds.) Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables; Dover: New York, NY, USA, 1965.
- Ubriaco, M.R. Entropies based on fractional calculus. Phys. Lett. A 2009, 373, 2516–2519.
- Yu, S.; Huang, T.Z.; Liu, X.; Chen, W. Information measures based on fractional calculus. Inf. Process. Lett. 2012, 112, 916–921.
- Radhakrishnan, C.; Chinnarasu, R.; Jambulingam, S. A Fractional Entropy in Fractal Phase Space: Properties and Characterization. Int. J. Stat. Mech. 2014, 2014, 460364.
- Wang, Q.A. Extensive generalization of statistical mechanics based on incomplete information theory. Entropy 2003, 5, 220–232.
- Machado, J.T. Fractional Order Generalized Information. Entropy 2014, 16, 2350–2361.
- Bagci, G.B. The third law of thermodynamics and the fractional entropies. Phys. Lett. A 2016, 380, 2615–2618.
- Jalab, H.A.; Subramaniam, T.; Ibrahim, R.W.; Kahtan, H.; Noor, N.F.M. New Texture Descriptor Based on Modified Fractional Entropy for Digital Image Splicing Forgery Detection. Entropy 2019, 21, 371.
- Yang, X.J. Advanced Local Fractional Calculus and Its Applications; World Science Publisher: New York, NY, USA, 2012.
- Karcı, A. New approach for fractional order derivatives: Fundamentals and analytic properties. Mathematics 2016, 4, 30.
- Karcı, A. Fractional order entropy: New perspectives. Optik 2016, 127, 9172–9177.
- Ferreira, R.A.; Tenreiro Machado, J. An Entropy Formulation Based on the Generalized Liouville Fractional Derivative. Entropy 2019, 21, 638.
- Machado, J.T.; Lopes, A.M. Fractional Rényi entropy. Eur. Phys. J. Plus 2019, 134, 217.
- Beliakov, G.; Sola, H.B.; Sánchez, T.C. A Practical Guide to Averaging Functions; Springer: Cham, Switzerland, 2016.
- Xu, D.; Erdogmuns, D. Renyi’s entropy, divergence and their nonparametric estimators. In Information Theoretic Learning; Springer: Berlin/Heidelberg, Germany, 2010; pp. 47–102.
- Van Eck, N.J.; Waltman, L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 2010, 84, 523–538.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).