This paper is an extended version of our paper published in the IEEE Information Theory Workshop (ITW), Cambridge, UK, 11–14 September 2016.

The explicit form of the rate-distortion function has rarely been obtained, except for a few cases where the Shannon lower bound coincides with the rate-distortion function over the entire range of positive rates. From an information-geometric point of view, the evaluation of the rate-distortion function is achieved by a projection onto the mixture family defined by the distortion measure. In this paper, we consider the

The rate-distortion function,

If the SLB is not tight, the explicit evaluation of the rate-distortion function has been obtained only in limited cases [

In this paper, we consider the constrained optimization of the definition of

Operational rate-distortion results have been obtained for the uniform scalar quantization of the generalized Gaussian source under the

Let

If the conditional distribution

The parameter

If the marginal reconstruction density

From the properties of the rate-distortion function

In this paper, we focus on difference distortion measures,

Throughout this paper, we assume that the function

Let

Then, the Shannon lower bound (SLB) is defined by
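In its standard form (generic notation, which may differ in detail from this paper's conventions), the SLB for a difference distortion measure $d(x, y) = \rho(x - y)$ can be stated as:

```latex
% Standard statement of the Shannon lower bound (SLB) for a
% difference distortion measure d(x, y) = \rho(x - y).
% h(\cdot) denotes differential entropy; the maximum is over all
% densities of Z satisfying the distortion budget.
R(D) \;\ge\; R_{\mathrm{SLB}}(D)
  \;=\; h(X) \;-\; \max_{p_Z \,:\; \mathbb{E}[\rho(Z)] \le D} h(Z)
```

The inner maximization is a classical maximum-entropy problem, whose solution is an exponential-family density determined by $\rho$.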

The next lemma shows that the SLB is in fact a lower bound to the rate-distortion function and that the difference between them is lower bounded by the Kullback–Leibler divergence.

Let

In information geometry, for a family of distributions

Hence, from the information geometrical viewpoint, the above lemma shows that the difference between

It is also easy to see from the lemma that the SLB coincides with

The following theorem claims that for a difference distortion measure, there is at most one source for which

Let

From the non-negativity of the divergence,

The tightness of the SLB for each

The tightness of the SLB at

We examine the rate-distortion trade-offs under the

For a difference distortion measure, we can assume that the

The SLB for the source (

It is well known that when

From Theorem 1, we immediately obtain the following corollary, which shows that the

In the case of

From Corollary 1, the SLB cannot be tight for all

We denote the rate-distortion function and bounds to it by indicating the parameters

We first prove the following lemma:

Let

The above lemma implies that

Thus, we obtain the following upper bound to

We also have the SLB for

Therefore, we arrive at the following theorem:

Since the upper bound is tight at

From Lemma 1 in

If the upper bound

The first part of the theorem is a corollary of Theorem 2 and Lemma 1. The second part corresponds to the case of

Since for

Since we know that if

The upper bound in Theorem 2 implies the following:

Since

The preceding corollary is well known in the case of the squared distortion measure, where the Gaussian source has the largest rate-distortion function not only among all
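For the squared-error case, the relevant closed form is the familiar Gaussian rate-distortion function $R(D) = \max\{\tfrac{1}{2}\ln(\sigma^2/D),\, 0\}$, which upper-bounds the rate-distortion function of any source with variance $\sigma^2$. A minimal numerical sketch of this textbook formula (the code is illustrative, not from this paper):

```python
import math

def gaussian_rd(variance: float, d: float) -> float:
    """Rate-distortion function (in nats) of a Gaussian source with the
    given variance under squared-error distortion:
    R(D) = max(0.5 * ln(variance / D), 0)."""
    if variance <= 0 or d <= 0:
        raise ValueError("variance and D must be positive")
    return max(0.5 * math.log(variance / d), 0.0)

# Example: variance 4, distortion 1 gives 0.5 * ln(4) nats (i.e., 1 bit).
rate = gaussian_rd(4.0, 1.0)
```

For $D \ge \sigma^2$ the rate is zero, since reconstructing every sample by the source mean already meets the distortion budget.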

As another example of a distortion measure that does not match the

In this section, we focus on the Laplacian source (

Upper and (Shannon) lower bounds that are asymptotically accurate as

We have shown that the generalized Gaussian distribution is the only source that can make the SLB tight for all
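As a self-contained numerical illustration of the quantities involved, the entropy-maximizing density under the constraint $\mathbb{E}|Z|^r \le D$ is a generalized Gaussian, which gives the SLB under an $r$-th power distortion in closed form; for a Laplacian source of scale $b$ under absolute distortion ($r = 1$) it reduces to $\ln(b/D)$. The formulas below are standard maximum-entropy computations, stated here in generic notation rather than taken from this paper:

```python
import math

def max_entropy_rth(r: float, d: float) -> float:
    """Differential entropy (nats) of the entropy-maximizing density with
    E|Z|^r = D (a generalized Gaussian):
    phi(D) = 1/r + ln(2 * Gamma(1 + 1/r)) + (1/r) * ln(r * D)."""
    return 1.0 / r + math.log(2.0 * math.gamma(1.0 + 1.0 / r)) + math.log(r * d) / r

def slb_laplacian(b: float, d: float, r: float = 1.0) -> float:
    """SLB for a Laplacian source of scale b (h(X) = 1 + ln(2b)) under an
    r-th power distortion; for r = 1 this equals max(ln(b / D), 0)."""
    h_source = 1.0 + math.log(2.0 * b)
    return max(h_source - max_entropy_rth(r, d), 0.0)
```

As a sanity check, for $r = 2$ the helper recovers the Gaussian maximum entropy $\tfrac{1}{2}\ln(2\pi e D)$, and for $r = 1$ the SLB of a Laplacian source with $b = 2$, $D = 1$ is $\ln 2$ nats.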

The author would like to thank the anonymous reviewers for their helpful comments and suggestions. This work was supported in part by JSPS grants 25120014, 15K16050, and 16H02825.

The author declares no conflict of interest.

Rate-distortion function

Distortion-rate bounds for the Laplacian source with