Jensen–Inaccuracy Information Measure

The purpose of this paper is to introduce the Jensen–inaccuracy measure and examine its properties. Furthermore, some results on the connections between the inaccuracy and Jensen–inaccuracy measures and some other well-known information measures are provided. Moreover, it is shown that, in three different optimization problems, the arithmetic mixture distribution provides optimal information in terms of the inaccuracy information measure. Finally, two real examples from image processing are studied, and some numerical results in terms of the inaccuracy and Jensen–inaccuracy information measures are obtained.


Introduction
In recent decades, several researchers have studied information theory and its applications in various fields such as statistics, physics, economics, and engineering. Shannon entropy [1] is a fundamental quantity in information theory; for a continuous random variable X having probability density function (PDF) f with support 𝒳, it is defined as
$$ H(f) = -\int f(x) \log f(x)\, dx, $$
where log denotes the natural logarithm. Throughout the paper, the support will be omitted in all the integrals. Several extensions of Shannon entropy have been considered by many researchers (see, for instance, the Rényi and Tsallis entropies [2,3]), providing one-parameter families of entropy measures. Moreover, several divergence measures based on entropy have been defined in order to measure similarity and dissimilarity between two density functions. Among them, the chi-square, Kullback-Leibler, and Rényi divergences, and their extensions, have been introduced; for further details, see Nielsen and Nock [4], Di Crescenzo and Longobardi [5], and Van Erven and Harremos [6].
Let X and Y be two random variables with density functions f and g, respectively. Then, the Kullback-Leibler divergence [7] is defined as
$$ KL(f, g) = \int f(x) \log \frac{f(x)}{g(x)}\, dx, $$
provided the integral exists. Because of some limitations of Shannon entropy, Kerridge [8] proposed a measure known as the inaccuracy measure (or the Kerridge measure). Consider f and g as two probability density functions. Then, the inaccuracy measure between f and g is given by
$$ K(f, g) = -\int f(x) \log g(x)\, dx. $$
Several extensions of the inaccuracy measure have been developed, just as for Shannon entropy. For more details, see Kayal and Sunoj [9] and the references therein.
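As a simple illustration (an added example, not drawn from the cited references), let f and g be exponential densities with rates λ and μ, respectively. Then
$$ K(f, g) = -\int_0^{\infty} \lambda e^{-\lambda x} \log\left(\mu e^{-\mu x}\right) dx = -\log \mu + \frac{\mu}{\lambda}, $$
which reduces to the Shannon entropy H(f) = 1 − log λ when μ = λ and exceeds it otherwise, since K(f, g) = H(f) + KL(f, g) ≥ H(f).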
In the literature, the class of Jensen divergences has been studied extensively as a general technique for developing information measures. Recently, other information measures, such as the Jensen-Shannon, Jensen-Fisher, and Jensen-Gini measures, have been studied as generalizations of well-known quantities. For further details, see Lin [10], Sánchez-Moreno et al. [11], and Mehrali et al. [12].
However, the link between the inaccuracy measure and the Jensen concept has remained unexplored so far. Therefore, the main motivation of this paper is to present the Jensen-inaccuracy information (JII) measure and its properties. Let us remark that the introduction of the JII measure is motivated by the fact that it can be expressed as a mixture of well-known divergence measures and that it is closely related to the arithmetic-geometric divergence measure. Moreover, the new measure performs well in assessing the similarity between images in the field of image quality assessment. Furthermore, we establish some results on the connection between the inaccuracy and Jensen-inaccuracy information measures and some other measures of discrimination, such as Rényi entropy, average entropy, and Rényi divergence. Next, we show that the arithmetic mixture distribution provides optimal information under three different optimization problems based on the inaccuracy information measure. In the following, some well-known and useful information measures are recalled.
An extended version of the Shannon entropy measure, for α > 0 and α ≠ 1, is defined by Rényi [2] as
$$ H_\alpha(f) = \frac{1}{1 - \alpha} \log \int f^{\alpha}(x)\, dx. \quad (3) $$
Several applications of Rényi entropy have been discussed in the literature. Furthermore, the Rényi divergence of order α > 0, α ≠ 1, between density functions f and g is defined by
$$ D_\alpha(f, g) = \frac{1}{\alpha - 1} \log \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, dx. \quad (4) $$
The information measures in (3) and (4) become the Shannon entropy and Kullback-Leibler divergence measures, respectively, when α tends to 1.
Another important divergence measure between two continuous density functions f and g is the chi-square divergence, defined as
$$ \chi^2(f, g) = \int \frac{\big(f(x) - g(x)\big)^2}{g(x)}\, dx. $$
In a similar manner, we can define χ²(g, f).

The rest of this paper is organized as follows. In Section 2, we first introduce the Jensen-inaccuracy information (JII) measure. Then, we show that JII can be expressed as a mixture of Kullback-Leibler divergence measures. We show that the Jensen-inaccuracy information measure has a close connection with the arithmetic-geometric divergence measure. Furthermore, we present an upper bound for the JII measure in terms of chi-square divergence measures. The (w, α)-Jensen-inaccuracy measure is also introduced in this section as an extended version of JII. We study the inaccuracy information measure for the escort and generalized escort distributions in Section 3. In Section 4, we consider the average entropy and define the average inaccuracy measure; some results are given in this regard. In Section 5, we show that the arithmetic mixture distribution provides optimal information under three different optimization problems in terms of the inaccuracy measure. Then, in Section 6, two real examples are presented in order to study a problem in image processing, and some numerical results are given in terms of the inaccuracy and Jensen-inaccuracy information measures; more precisely, our measure is useful for detecting similarity between images. Finally, in Section 7, concluding remarks are provided.

The Jensen-Inaccuracy Measure
In this section, we introduce the Jensen-inaccuracy measure and then provide a representation for this information measure in terms of the Kullback-Leibler divergence measure. Furthermore, we explore the connection between the Jensen-inaccuracy and arithmetic-geometric divergence measures. We also provide an upper bound for the Jensen-inaccuracy measure based on the chi-square divergence measure. At the end of this section, we introduce the (w, α)-Jensen-inaccuracy measure and establish a result for this extended measure.

Definition 1.
Let f, f_0, and f_1 be three density functions, and let α ∈ [0, 1]. Then, the Jensen-inaccuracy measure between f_0 and f_1 with respect to f is defined by
$$ JK_\alpha(f, f_0, f_1) = \alpha K(f, f_0) + (1 - \alpha) K(f, f_1) - K\big(f, \alpha f_0 + (1 - \alpha) f_1\big). \quad (6) $$

Theorem 1. The Jensen-inaccuracy measure in (6) is non-negative.

Proof. From the convexity properties of −log(·), we have, for every x,
$$ -\log\big(\alpha f_0(x) + (1 - \alpha) f_1(x)\big) \le -\alpha \log f_0(x) - (1 - \alpha) \log f_1(x). \quad (7) $$
Now, by multiplying both sides of (7) by f(x) and then integrating with respect to x, we obtain
$$ K\big(f, \alpha f_0 + (1 - \alpha) f_1\big) \le \alpha K(f, f_0) + (1 - \alpha) K(f, f_1), $$
that is, JK_α(f, f_0, f_1) ≥ 0, as required.

The Jensen-Inaccuracy Measure and its Connection to Kullback-Leibler Divergence
Here, we provide a representation for the Jensen-inaccuracy measure in terms of the Kullback-Leibler divergence measure.

Theorem 2.
A representation for the Jensen-inaccuracy measure in (6) based on a mixture of Kullback-Leibler divergence measures is given by
$$ JK_\alpha(f, f_0, f_1) = \alpha KL(f, f_0) + (1 - \alpha) KL(f, f_1) - KL\big(f, \alpha f_0 + (1 - \alpha) f_1\big). $$

Proof. According to the definition of the Jensen-inaccuracy measure and the relation K(f, g) = H(f) + KL(f, g), applied to each of the three inaccuracy terms in (6), we have
$$ JK_\alpha(f, f_0, f_1) = \alpha \big[H(f) + KL(f, f_0)\big] + (1 - \alpha)\big[H(f) + KL(f, f_1)\big] - \big[H(f) + KL\big(f, \alpha f_0 + (1 - \alpha) f_1\big)\big] = \alpha KL(f, f_0) + (1 - \alpha) KL(f, f_1) - KL\big(f, \alpha f_0 + (1 - \alpha) f_1\big), $$
as required.
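To make this representation concrete, the following minimal Python sketch (added for illustration; the normal densities, the grid, and α = 0.3 are arbitrary choices, not taken from the paper) evaluates JK_α(f, f_0, f_1) both directly from inaccuracy measures and through the Kullback-Leibler representation above; the two values agree and are non-negative, in line with Theorem 1.

```python
# Numerical sanity check of Theorems 1 and 2 for arbitrary normal densities
# (illustrative sketch; densities, grid, and alpha are assumptions, not from the paper).
import numpy as np
from scipy.stats import norm

x = np.linspace(-15, 15, 100001)      # integration grid
dx = x[1] - x[0]
f  = norm.pdf(x, 0.0, 1.5)            # reference density f
f0 = norm.pdf(x, -1.0, 1.0)           # density f0
f1 = norm.pdf(x,  2.0, 2.0)           # density f1
alpha = 0.3
mix = alpha * f0 + (1 - alpha) * f1   # arithmetic mixture alpha*f0 + (1-alpha)*f1

def inaccuracy(p, q):
    """Kerridge inaccuracy K(p, q) = -int p log q (Riemann sum on the grid)."""
    return -np.sum(p * np.log(q)) * dx

def kl(p, q):
    """Kullback-Leibler divergence KL(p, q) = int p log(p / q)."""
    return np.sum(p * np.log(p / q)) * dx

jk_direct = (alpha * inaccuracy(f, f0) + (1 - alpha) * inaccuracy(f, f1)
             - inaccuracy(f, mix))
jk_via_kl = alpha * kl(f, f0) + (1 - alpha) * kl(f, f1) - kl(f, mix)

print(jk_direct, jk_via_kl)           # equal up to numerical error, and >= 0
```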
Next, we extend the definition of the Jensen-inaccuracy measure to n + 1 density functions.

Definition 2. Let X_1, ..., X_n, and Y be random variables with density functions f_1, ..., f_n, and f, respectively, and let α_1, ..., α_n be non-negative real numbers such that ∑_{i=1}^{n} α_i = 1. Then, the Jensen-inaccuracy measure is defined as
$$ JK_\alpha(f, f_1, \ldots, f_n) = \sum_{i=1}^{n} \alpha_i K(f, f_i) - K\Big(f, \sum_{i=1}^{n} \alpha_i f_i\Big). \quad (9) $$

Theorem 3. The Jensen-inaccuracy measure in (9) can be written in terms of Kullback-Leibler divergences as
$$ JK_\alpha(f, f_1, \ldots, f_n) = \sum_{i=1}^{n} \alpha_i KL(f, f_i) - KL\Big(f, \sum_{i=1}^{n} \alpha_i f_i\Big). $$

Proof. From the definition of JK_α(f, f_1, ..., f_n) in (9) and the relation K(f, g) = H(f) + KL(f, g), we have
$$ JK_\alpha(f, f_1, \ldots, f_n) = \sum_{i=1}^{n} \alpha_i \big[H(f) + KL(f, f_i)\big] - \Big[H(f) + KL\Big(f, \sum_{i=1}^{n} \alpha_i f_i\Big)\Big] = \sum_{i=1}^{n} \alpha_i KL(f, f_i) - KL\Big(f, \sum_{i=1}^{n} \alpha_i f_i\Big), $$
as required.

Connection between Jensen-Inaccuracy and Arithmetic-Geometric Divergence Measures
Now, we explore the connection between the Jensen-inaccuracy and arithmetic-geometric divergence measures. Then, we provide an upper bound for the Jensen-inaccuracy measure based on the chi-square divergence measure.
Let f_0 and f_1 be two density functions. Then, the arithmetic-geometric divergence measure is defined as
$$ T(f_0, f_1) = \int \frac{f_0(x) + f_1(x)}{2} \log\left(\frac{f_0(x) + f_1(x)}{2\sqrt{f_0(x) f_1(x)}}\right) dx. $$
For more details, see Taneja [13].
In the following definition, we provide an extension of the arithmetic-geometric divergence measure to n density functions.

Definition 3. Let X_1, ..., X_n be random variables with density functions f_1, ..., f_n, respectively, and let α_1, ..., α_n be non-negative real numbers such that ∑_{i=1}^{n} α_i = 1. Then, the extended arithmetic-geometric divergence measure is defined as
$$ T(f_1, \ldots, f_n; \boldsymbol{\alpha}) = \int \Big(\sum_{i=1}^{n} \alpha_i f_i(x)\Big) \log\left(\frac{\sum_{i=1}^{n} \alpha_i f_i(x)}{\prod_{i=1}^{n} f_i^{\alpha_i}(x)}\right) dx, \quad (12) $$
where α is a brief notation for (α_1, ..., α_n).
In the following, we explore the connection between the Jensen-inaccuracy measure in (9) and the arithmetic-geometric divergence measure in (12).

Theorem 4. Let f_T(x) = ∑_{i=1}^{n} α_i f_i(x) denote the arithmetic mixture density of f_1, ..., f_n. Then,
$$ JK_\alpha(f_T, f_1, \ldots, f_n) = T(f_1, \ldots, f_n; \boldsymbol{\alpha}). $$

Proof. From the assumption f = f_T and Theorem 3, we have
$$ JK_\alpha(f_T, f_1, \ldots, f_n) = \sum_{i=1}^{n} \alpha_i KL(f_T, f_i) - KL(f_T, f_T) = \sum_{i=1}^{n} \alpha_i \int f_T(x) \log\frac{f_T(x)}{f_i(x)}\, dx = \int f_T(x) \log\frac{f_T(x)}{\prod_{i=1}^{n} f_i^{\alpha_i}(x)}\, dx = T(f_1, \ldots, f_n; \boldsymbol{\alpha}), $$
as required.
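In particular, for n = 2 with α_1 = α_2 = 1/2 (a special case spelled out here for clarity), the theorem states that the Jensen-inaccuracy measure between f_0 and f_1 with respect to their equal-weight arithmetic mixture f_T = (f_0 + f_1)/2 is exactly the arithmetic-geometric divergence of Taneja [13]:
$$ JK_{1/2}(f_T, f_0, f_1) = \int \frac{f_0(x) + f_1(x)}{2} \log\left(\frac{f_0(x) + f_1(x)}{2\sqrt{f_0(x) f_1(x)}}\right) dx = T(f_0, f_1). $$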

Remark 1.
From the inequality log x ≤ x − 1 for x > 0, it is easy to obtain an upper bound for JK_α(f_T, f_1, ..., f_n) based on the chi-square divergence measure:
$$ JK_\alpha(f_T, f_1, \ldots, f_n) = \sum_{i=1}^{n} \alpha_i KL(f_T, f_i) \le \sum_{i=1}^{n} \alpha_i \chi^2(f_T, f_i). $$

The (w, α)-Jensen-Inaccuracy Measure
Here, the (w, α)-Jensen-inaccuracy measure is defined. Moreover, we establish a result for this extended measure.

Definition 4.
Let f , f 0 and f 1 be three density functions. Then, the (w, α)-Jensen-inaccuracy measure between f 0 and f 1 with respect to f is defined by

Theorem 5.
A representation for the (w, α)-Jensen-inaccuracy measure based on the Kullback-Leibler divergence is given by

Proof. It can be proven in the same manner as Theorem 2.

Inaccuracy Information Measure of the Escort and Generalized Escort Distributions
The escort distribution is a basic concept in non-extensive statistical mechanics and coding theory; it is closely related to the Tsallis and Rényi entropies. For more details, see Bercher [14]. We show that the inaccuracy measure between an arbitrary density and its corresponding escort density can be expressed as a mixture of Shannon and Rényi entropies. Furthermore, another finding, associated with the inaccuracy measure between a generalized escort distribution and each of its components, reveals some interesting connections in terms of Kullback-Leibler and Rényi divergences.
Let f be a density function. Then, the escort density of order α > 0 associated with f is defined as
$$ f_\alpha(x) = \frac{f^{\alpha}(x)}{\int f^{\alpha}(t)\, dt}. $$

Theorem 6. Let f be a density function and f_α be the escort density corresponding to f. Then, for α > 0, we obtain
$$ K(f, f_\alpha) = \alpha H(f) + (1 - \alpha) H_\alpha(f). $$

Proof. From the definition of the inaccuracy measure between f and f_α, we have
$$ K(f, f_\alpha) = -\int f(x) \log \frac{f^{\alpha}(x)}{\int f^{\alpha}(t)\, dt}\, dx = -\alpha \int f(x) \log f(x)\, dx + \log \int f^{\alpha}(t)\, dt = \alpha H(f) + (1 - \alpha) H_\alpha(f), $$
as required.
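As a concrete check of Theorem 6 (an added illustration; the exponential form is an arbitrary choice), let f(x) = λe^{−λx}, x > 0. Its escort density of order α is again exponential, f_α(x) = αλ e^{−αλx}, and a direct calculation gives
$$ K(f, f_\alpha) = -\int_0^{\infty} \lambda e^{-\lambda x}\big[\log(\alpha\lambda) - \alpha\lambda x\big]\, dx = \alpha - \log\lambda - \log\alpha, $$
which coincides with α H(f) + (1 − α) H_α(f), since H(f) = 1 − log λ and H_α(f) = −log λ − (log α)/(1 − α).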
Let f and g be two probability density functions. Then, the generalized escort density of order α ∈ (0, 1) is defined as
$$ h_\alpha(x) = \frac{f^{\alpha}(x)\, g^{1-\alpha}(x)}{\int f^{\alpha}(t)\, g^{1-\alpha}(t)\, dt}. $$

Theorem 7. The inaccuracy information measure between f and the generalized escort density h_α is given by
$$ K(f, h_\alpha) = H(f) + (1 - \alpha)\big[KL(f, g) - D_\alpha(f, g)\big], $$
where D_α(f, g) is the relative Rényi entropy (Rényi divergence) defined in (4).

Proof. From the definition of K(f, h_α), we derive
$$ K(f, h_\alpha) = -\int f(x) \log \frac{f^{\alpha}(x)\, g^{1-\alpha}(x)}{\int f^{\alpha}(t)\, g^{1-\alpha}(t)\, dt}\, dx = \alpha H(f) + (1 - \alpha) K(f, g) + \log \int f^{\alpha}(t)\, g^{1-\alpha}(t)\, dt. $$
Since log ∫ f^α(t) g^{1−α}(t) dt = (α − 1) D_α(f, g) and K(f, g) = H(f) + KL(f, g), the result follows, as required.
Theorem 8. Let f_0 and f_1 be two density functions, and consider the arithmetic and geometric mixture densities, respectively, as f_a(x) = p f_0(x) + (1 − p) f_1(x) and
$$ f_g(x) = \frac{f_0^{p}(x)\, f_1^{1-p}(x)}{\int f_0^{p}(t)\, f_1^{1-p}(t)\, dt}, \qquad p \in (0, 1). $$
Then, a lower bound for K(f_a, f_g) is given by
$$ K(f_a, f_g) \ge H(f_a) + \log \int f_0^{p}(t)\, f_1^{1-p}(t)\, dt. $$

Proof. From the definition of K(f_a, f_g), by using the arithmetic mean-geometric mean inequality f_0^{p}(x) f_1^{1-p}(x) ≤ p f_0(x) + (1 − p) f_1(x) = f_a(x), we have
$$ K(f_a, f_g) = -\int f_a(x) \log \frac{f_0^{p}(x)\, f_1^{1-p}(x)}{\int f_0^{p}(t)\, f_1^{1-p}(t)\, dt}\, dx \ge -\int f_a(x) \log f_a(x)\, dx + \log \int f_0^{p}(t)\, f_1^{1-p}(t)\, dt = H(f_a) + \log \int f_0^{p}(t)\, f_1^{1-p}(t)\, dt, $$
as required.

Inaccuracy Measure Based on Average Entropy
Let X be a random variable with PDF f . Then, the average entropy associated with f is defined as For pertinent details, see Kittaneh et al. [15].
Theorem 9. Let f be a density function. Then, an upper bound for the Shannon entropy based on the inaccuracy measure is given by (20), where f_2 is the corresponding escort distribution of the density f with order 2.
Proof. From the definition of average entropy, we have as required.
Definition 5. Let X be a random variable with density f. Then, the average inaccuracy measure is defined as

Theorem 10. Let f and g be two PDFs. Then, the average inaccuracy measure between f and g, AK(f, g), can be expressed as

Proof. From the definition of the average inaccuracy measure, we have

as required.

Optimal Information Model under Inaccuracy Information Measure
In this section, we prove that the arithmetic mixture distribution provides optimal information under three different optimization problems associated with inaccuracy information measures. For more details on optimal information properties of some statistical distributions, one may refer to Kharazmi et al. [16] and the references therein.
Theorem 11. Let f, f_0, and f_1 be three density functions. Then, the solution to the optimization problem

is the arithmetic mixture density with mixing parameter p = 1/(1 + λ_0), where λ_0 > 0 is the Lagrange multiplier and η is the constraint level associated with the optimization problem.
Applications in Image Processing

The intensity values of the pixels of a grayscale digital image lie in the set {0, . . . , L − 1}, where L represents the number of intensity values. Suppose that n_k is the number of times the kth intensity value appears in the image. The corresponding histogram of a digital image is then the histogram of the pixel intensity values in the set {0, . . . , L − 1} (one may refer to Gonzalez [17]).
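As a small computational illustration (added here; the file name, the use of Pillow/NumPy, and the choice L = 256 are assumptions, not part of the original analysis), such a normalized histogram can be obtained directly from the pixel intensities:

```python
# Minimal sketch: normalized intensity histogram {p_k} of a grayscale image.
# "cameraman.png" and L = 256 (8-bit intensities) are assumed for illustration.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("cameraman.png").convert("L"))  # intensities in {0, ..., 255}
L = 256
counts = np.bincount(img.ravel(), minlength=L)   # n_k: number of pixels with intensity k
p = counts / counts.sum()                        # empirical probability of each intensity level
```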

Non-Parametric Jensen-Inaccuracy Estimation
We now show an application of the inaccuracy measure and the Jensen-inaccuracy measure in (6) to image processing. Let X_1, ..., X_n be a random sample with probability density function f. Then, the kernel estimate of the density f, based on a kernel function K with bandwidth h_X > 0, at a fixed point x, is given by
$$ \hat{f}(x) = \frac{1}{n h_X} \sum_{i=1}^{n} K\left(\frac{x - X_i}{h_X}\right). \quad (28) $$
Similarly, the non-parametric estimate of a density g with bandwidth h_Y > 0, based on a random sample Y_1, ..., Y_n, is expressed as
$$ \hat{g}(x) = \frac{1}{n h_Y} \sum_{i=1}^{n} K\left(\frac{x - Y_i}{h_Y}\right). \quad (29) $$
For more details, see Duong et al. [18]. Upon making use of (28) and (29), the integrated non-parametric estimates of the inaccuracy and Jensen-inaccuracy measures are given, respectively, by
$$ \hat{K}(f, g) = -\int \hat{f}(x) \log \hat{g}(x)\, dx $$
and
$$ \widehat{JK}_\alpha(f, f_0, f_1) = -\alpha \int \hat{f}(x) \log \hat{f}_0(x)\, dx - (1 - \alpha) \int \hat{f}(x) \log \hat{f}_1(x)\, dx + \int \hat{f}(x) \log\big(\alpha \hat{f}_0(x) + (1 - \alpha) \hat{f}_1(x)\big)\, dx, $$
where h_0, h_1, and h are the corresponding bandwidths for the kernel estimates of the densities f_0, f_1, and f, respectively. Here, we use the Gaussian kernel K(u) = (1/√(2π)) e^{−u²/2}.

Next, we present two examples of image processing (two reference images, the grayscale cameraman and lake images) and compute the inaccuracy and Jensen-inaccuracy information measures between the original picture and each of its adjusted versions for both cases. Figure 1 shows the original cameraman picture, denoted by X, and three adjusted versions of this original picture, considered as Y (= X + 0.3) (increased brightness), Z (= 2 × X) (increased contrast), and W (= 0.5 × X + 0.5) (increased brightness and decreased contrast). The cameraman image consists of 512 × 512 cells, and the gray level of each cell takes a value between 0 (black) and 1 (white). Figure 2 shows the original lake image, which also consists of 512 × 512 cells, with the gray level of each cell taking a value in the interval [0, 1] (0 for black and 1 for white). This image is labeled as X, and its three adjusted versions are labeled as Y (= X + 0.3) (increased brightness), Z (= 2 × X) (increased contrast), and W (= √X) (gamma corrected).

Now, we first compute the inaccuracy between the original image and each of its adjusted versions. Then, we obtain the amount of dissimilarity between each pair of the adjusted images with respect to the original image based on the Jensen-inaccuracy information measure. For both images, we consider three interferences of the original images, as described above. For more details, see the EBImage package in R software [19].
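To make the estimation step concrete, the following minimal Python sketch (added for illustration; the synthetic samples, fixed bandwidth, grid, and α = 0.5 are assumptions, and it does not reproduce the EBImage-based preprocessing of the actual images) computes plug-in estimates of K and JK_α from Gaussian kernel density estimates on a common grid:

```python
# Plug-in estimates of the inaccuracy and Jensen-inaccuracy measures from samples,
# using Gaussian kernel density estimates on a common grid (illustrative sketch only;
# samples, bandwidth, grid, and alpha below are arbitrary stand-ins, not the paper's data).
import numpy as np

def kde(sample, grid, h):
    """Gaussian kernel density estimate of `sample`, evaluated on `grid`."""
    u = (grid[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

def inaccuracy_hat(p, q, dx):
    """Estimated Kerridge inaccuracy K(p, q) = -int p log q (Riemann sum)."""
    q = np.maximum(q, 1e-300)          # floor to avoid log(0) where the estimate underflows
    return -np.sum(p * np.log(q)) * dx

def jensen_inaccuracy_hat(f, f0, f1, alpha, dx):
    """Plug-in estimate of JK_alpha(f, f0, f1) built from the estimated densities."""
    mix = alpha * f0 + (1 - alpha) * f1
    return (alpha * inaccuracy_hat(f, f0, dx)
            + (1 - alpha) * inaccuracy_hat(f, f1, dx)
            - inaccuracy_hat(f, mix, dx))

rng = np.random.default_rng(0)
x  = rng.normal(0.5, 0.1, 1000)        # stand-in for intensity values of the original image
y0 = x + 0.3                           # adjusted version 1 (brightness shift)
y1 = 2 * x                             # adjusted version 2 (contrast change)

grid = np.linspace(-0.5, 2.5, 3000)
dx = grid[1] - grid[0]
h = 0.05                               # fixed bandwidth; in practice it would be chosen data-adaptively
f_hat, f0_hat, f1_hat = kde(x, grid, h), kde(y0, grid, h), kde(y1, grid, h)

print(inaccuracy_hat(f_hat, f0_hat, dx))
print(jensen_inaccuracy_hat(f_hat, f0_hat, f1_hat, alpha=0.5, dx=dx))
```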

The extracted histograms, along with the corresponding empirical densities for pictures X, Y, Z, and W, are plotted in Figures 3 and 4 for the cameraman and lake images, respectively.

We can see from Figures 1 and 3 that W has the highest degree of similarity to the original picture X, followed by Y, whereas Z diverges from X to the highest degree. Moreover, from Figures 2 and 4, the same observation is found for the lake image and its three adjusted versions. The inaccuracy and Jensen-inaccuracy information measures for all pictures (for both the cameraman and lake images) are presented in Table 1. Therefore, the inaccuracy and Jensen-inaccuracy information measures can be considered as efficient criteria for comparing the similarity between an original picture and its adjusted versions. Following Fan et al. [20], we have tried to adhere carefully to Axioms 1 and 2 of axiomatic design theory, which has been proposed in recent decades; Axiom 1 concerns verification of the validity of designs, and Axiom 2 states that one must choose the best design among several options.

Conclusions
In this paper, by considering the inaccuracy measure, we have proposed the Jensen-inaccuracy and (w, α)-Jensen-inaccuracy information measures. We have specifically shown that the Jensen-inaccuracy measure is connected to the arithmetic-geometric divergence measure. Then, we have studied the inaccuracy measure between the escort distribution and its underlying density. Furthermore, we have examined the inaccuracy measure between the generalized escort distribution and its components. It has been shown that these inaccuracy measures are closely connected with Rényi entropy, average entropy, and Rényi divergence. Interestingly, we have shown that the arithmetic mixture distribution provides optimal information under three different optimization problems associated with the inaccuracy measure. Finally, we have described two applications of the inaccuracy and Jensen-inaccuracy measures to image processing. We have considered three adjusted versions of the original cameraman and lake images and have then examined the dissimilarity between the original image and each of its adjusted versions for both cases.