Article

Color Image Enhancement Focused on Limited Hues

Tadahiro Azetsu, Noriaki Suetake, Keisuke Kohashi and Chisa Handa

1 Department of Culture and Creative Arts, Yamaguchi Prefectural University, Yamaguchi 753-8502, Japan
2 Graduate School of Sciences and Technology for Innovation, Yamaguchi University, Yamaguchi 753-8512, Japan
* Author to whom correspondence should be addressed.
J. Imaging 2022, 8(12), 315; https://doi.org/10.3390/jimaging8120315
Submission received: 29 September 2022 / Revised: 17 November 2022 / Accepted: 21 November 2022 / Published: 23 November 2022

Abstract

Some color images primarily comprise specific hues; for example, food images predominantly contain warm hues, which are essential for creating a delicious impression. This paper proposes a color image enhancement method in which the hues to be enhanced can be selected arbitrarily. The current chroma is taken into account so that near-achromatic colors are not over-enhanced. The effectiveness of the proposed method was confirmed through experiments using several food images.

1. Introduction

Color images permeate our daily lives by recording various scenes. The popularity of smartphones and other digital devices has created an environment where people can easily capture photographs. Furthermore, they can share photographs using Internet technologies, such as social networking services (SNSs).
For instance, many food images taken in various environments are posted on SNSs. However, their quality may be poor because of the performance of the digital device and the lighting environment. Food images appear delicious when their lightness and chroma are satisfactory. In addition, food images mainly involve warm hues, such as red, yellow, and orange. Therefore, color image enhancement that focuses on warm hues is necessary for food images. Furthermore, the hues should not change after the image enhancement.
Several image enhancement methods that preserve the hue in the RGB color space have been proposed to improve the quality of acquired images [1,2,3,4,5,6,7,8]. Furthermore, image enhancement methods in other color spaces, such as HSI and CIELAB color spaces, have been proposed [9,10,11,12,13,14]. These color spaces can directly represent hue, chroma, and lightness.
Our previous method [14] using the CIELAB color space was computationally expensive, and real-time processing was difficult. The improved method [15,16] solved the problem of computational cost; however, its concrete applications in daily life were unclear. We focus on the limited hues that often appear in food images and propose an image enhancement method that preserves the hue in the CIELAB color space to match human visual characteristics. The novelties of the proposed method are twofold: a weighting function with hue as a variable is introduced to achieve limited-hue image enhancement naturally, and unnatural coloring of near-achromatic colors is avoided by considering the magnitude of the current chroma. One limitation of the proposed method is that it does not incorporate characteristics of the human visual system such as the Abney effect [17] and the Helmholtz–Kohlrausch effect [18,19]. Experiments were conducted using several digital images to verify the performance of the proposed method.
The rest of this paper is organized as follows. Section 2 explains the limited hue-focused chroma enhancement method. Section 3 provides the experimental results of applying the proposed method to several digital food images. Section 4 presents the conclusions of our study.

2. Limited Hue-Focused Chroma Enhancement in CIELAB Color Space

This study employs the CIELAB color space for the image enhancement processing because it adequately represents human visual perception.

2.1. Conversion from RGB Color Space to CIELAB Color Space

Generally, digital color images acquired by digital devices are represented in the RGB color space, so conversion from RGB color components to CIELAB color components is required. The RGB components of the original image are first inverse-gamma-corrected to obtain linear RGB components; here, the RGB color space is assumed to be the sRGB color space. The linear RGB components are denoted as {Q_C}, C ∈ {R, G, B}, where R, G, and B denote red, green, and blue, respectively, and each Q_C is assumed to be normalized to [0, 1].
Next, Q_C is converted into the color components (X, Y, Z) of the CIEXYZ color space. Finally, (X, Y, Z) is converted into the color components (L*, a*, b*) of the CIELAB color space. L* is the lightness, ranging from 0 to 100; a* and b* are the chromaticity indices in the red–green and yellow–blue directions, respectively.
Chroma C* and hue h are given by the following equations using a* and b* [20]:

C^* = \sqrt{(a^*)^2 + (b^*)^2},    (1)

h = \arctan(b^* / a^*).    (2)

Although h is generally given in the range [0, 360], in this study it is treated as [−180, 180] to simplify the formulation. The minimum value of C* is zero, and the maximum value varies depending on L* and h. Here, h is set to 0 when C* < 0.1.
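For concreteness, the conversion and Equations (1) and (2) can be sketched as follows in Python/NumPy; this is an independent sketch assuming sRGB primaries with a D65 white point, not the authors' released MATLAB implementation.

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert an sRGB image (floats in [0, 1], shape H x W x 3) to CIELAB (D65)."""
    # Inverse gamma correction: sRGB -> linear RGB
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> CIEXYZ (sRGB matrix, D65 white point)
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ M.T
    xyz /= np.array([0.9505, 1.0, 1.089])      # normalize by the reference white
    # CIEXYZ -> CIELAB
    eps, kappa = 216 / 24389, 24389 / 27
    f = np.where(xyz > eps, np.cbrt(xyz), (kappa * xyz + 16) / 116)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return L, a, b

def chroma_hue(a, b):
    """Chroma C* (Eq. (1)) and hue h in degrees in [-180, 180] (Eq. (2))."""
    C = np.hypot(a, b)
    h = np.degrees(np.arctan2(b, a))
    h = np.where(C < 0.1, 0.0, h)              # near-achromatic pixels: h set to 0
    return C, h
```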

2.2. Hue-Based Weight Function for Chroma Enhancement

The weight function k is introduced to achieve chroma enhancement in a limited range of hues.
k = \alpha \exp\left( -\frac{\left( (h - \theta)/180 \right)^2}{\beta} \right) + 1,    (3)
where α, β, and θ are parameters that determine the shape of the Gaussian function: θ is the target hue and the center of the Gaussian, α determines the degree of chroma enhancement, and β determines the range of hues to be enhanced. When β is small, only hues close to θ are enhanced; when β is larger, a wider range of hues is enhanced. Figure 1 shows the weight function k.
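A minimal sketch of the weight in Equation (3), using the parameter values adopted later in the experiments (the function name is chosen here for illustration):

```python
import numpy as np

def hue_weight(h, alpha=3.0, theta=72.0, beta=0.1):
    """Weight k of Eq. (3): a Gaussian bump of height alpha centered at the
    target hue theta (in degrees) on top of a baseline of 1 (no enhancement)."""
    return alpha * np.exp(-(((h - theta) / 180.0) ** 2) / beta) + 1.0
```

For example, hue_weight(np.array([72.0, 0.0, -120.0])) is approximately [4.0, 1.6, 1.0]: the target hue is boosted strongly, a moderately distant hue only slightly, and an opposite hue essentially not at all.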
Using the weight function k, chroma enhancement that preserves the hue is performed as follows:

C_k^* = \sqrt{(k a^*)^2 + (k b^*)^2} = k C^*.    (4)

When k is close to 1, the enhanced chroma C_k* is almost the same as the original C*. The hue h is preserved in the CIELAB color space because \arctan(k b^*/k a^*) = \arctan(b^*/a^*). However, the RGB color components obtained by converting the enhanced CIELAB color components (k a*, k b*, L*) may fall outside the color gamut, depending on the magnitude of k. In this case, the quality of the resulting image degrades.
To address the color gamut problem, the chroma C_e* after enhancement is given as follows:

C_e^* = \begin{cases} C_k^* & \text{within gamut,} \\ C_{\mathrm{LU}}^* & \text{otherwise,} \end{cases}    (5)

where C_LU* is an approximation of the maximum chroma in the color gamut determined by L* and h, obtained from a lookup table using the method in Ref. [15]. Finally, the enhanced CIELAB color components are converted into RGB color components to obtain the resulting images [15]. Figure 2 shows the effect of the gamut correction using Equation (5).
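The scaling of Equation (4) and the gamut clipping of Equation (5) can be sketched as below; gamut_max_chroma is a hypothetical placeholder for the lookup table of Ref. [15], not a function provided by that work.

```python
import numpy as np

def enhance_chroma(a, b, L, k, gamut_max_chroma):
    """Scale (a*, b*) by the hue-dependent weight k (Eq. (4)) and clip the
    enhanced chroma to an approximate gamut boundary C*_LU (Eq. (5)).

    gamut_max_chroma(L, h): hypothetical lookup returning the maximum in-gamut
    chroma for each (L*, h) pair, as precomputed in Ref. [15]."""
    C = np.hypot(a, b)                     # original chroma C*
    h = np.degrees(np.arctan2(b, a))       # hue, preserved by uniform scaling
    C_k = k * C                            # enhanced chroma before the gamut check
    C_e = np.minimum(C_k, gamut_max_chroma(L, h))
    scale = np.divide(C_e, C, out=np.ones_like(C), where=C > 1e-6)
    return scale * a, scale * b            # arctan(s*b / s*a) = arctan(b / a), so h is unchanged
```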

2.3. Adjustment of the Degree of Chroma Enhancement Considering Current Chroma

As illustrated in Figure 3, however, the enhancement using Equation (3) adds unnatural colors to near-achromatic colors, such as whitish colors.
To solve this problem, a function t is introduced into Equation (3) so that the degree of chroma enhancement is adjusted according to the magnitude of the current chroma:

k = \alpha\, t\left( C^* / C^*_{\max} \right) \exp\left( -\frac{\left( (h - \theta)/180 \right)^2}{\beta} \right) + 1,    (6)

where C*_max is the maximum chroma of the original image and t is a tone-mapping function defined as follows:

t(x) = \begin{cases} 0 & x < \mathrm{MIN}, \\ \dfrac{x - \mathrm{MIN}}{\mathrm{MAX} - \mathrm{MIN}} & \mathrm{MIN} \le x \le \mathrm{MAX}, \\ 1 & x > \mathrm{MAX}. \end{cases}    (7)
Figure 4 shows the tone-mapping function t, and Figure 5 shows the flow diagram of the proposed method.
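Equations (6) and (7) can be sketched as follows; the parameter names lo and hi stand for MIN and MAX and are chosen here for illustration.

```python
import numpy as np

def tone_map(x, lo=0.2, hi=0.8):
    """Tone-mapping function t of Eq. (7): 0 below MIN, a linear ramp between
    MIN and MAX, and 1 above MAX."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def adjusted_weight(h, C, alpha=3.0, theta=72.0, beta=0.1, lo=0.2, hi=0.8):
    """Chroma-aware weight k of Eq. (6): the Gaussian term is scaled by
    t(C*/C*_max), so near-achromatic pixels (small C*) receive k close to 1
    and are left almost untouched."""
    t = tone_map(C / C.max(), lo, hi)
    return alpha * t * np.exp(-(((h - theta) / 180.0) ** 2) / beta) + 1.0
```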

3. Experimental Results

3.1. Food Image Enhancement

Experiments were conducted to illustrate the performance of the proposed method using digital food images. The parameters α, β, MIN, and MAX were set to 3, 0.1, 0.2, and 0.8, respectively. θ was set to 72, the average value of h over the top 100 ranked photos posted as of April 2021 on the SNS site “SnapDish”, which specializes in cooking [21]. A problem in this calculation is that values of h near −180 and 180 differ greatly numerically even though they represent almost the same hue. Therefore, θ is determined using the following equation:
\theta = \arctan\left( \left\langle \overline{b^*} \right\rangle / \left\langle \overline{a^*} \right\rangle \right),    (8)

where ⟨·⟩ denotes the average over all images, and the overlines denote the average values of a* and b* within each image.
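A sketch of Equation (8), assuming the per-image a* and b* planes are already available (the function name and input format are illustrative):

```python
import numpy as np

def target_hue(ab_planes):
    """Estimate theta (Eq. (8)) from per-image mean a* and b* values averaged
    over all images. Averaging in the (a*, b*) plane avoids the wrap-around
    problem of averaging hue angles near the -180/180 boundary directly."""
    a_bar = np.array([a.mean() for a, _ in ab_planes])   # per-image mean a*
    b_bar = np.array([b.mean() for _, b in ab_planes])   # per-image mean b*
    return np.degrees(np.arctan2(b_bar.mean(), a_bar.mean()))
```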
Figure 6 shows the experimental results for seven digital images. These are 24-bit color images of (1) 1015 × 501, (2) 1920 × 1080, (3) 1919 × 1080, (4) 1706 × 960, (5) 1280 × 1280, (6) 1478 × 1108, and (7) 1280 × 664 pixels, respectively. The first column shows the original images, the second column shows the resulting images using the proposed method, and the third, fourth, and fifth columns show the resulting images using the methods in Ref. [2], Ref. [6], and Refs. [12,13], respectively. In (b1), the unnatural coloring of whitish colors is improved compared with (b) in Figure 3. In (b2) and (b3), the chroma of the sweets is appropriately enhanced, whereas the blue and gray dishes remain nearly unchanged. In (b4) and (b5), the cake and crab are vivid, whereas the brown and white tables are largely unaffected. In (b6) and (b7), the yellow areas are effectively brightened. In addition, the proposed method provides sufficient color enhancement compared with the comparison methods, although (e4) is more vivid than (b4), which was obtained by the proposed method. The same parameters were used for all images in the experiments to demonstrate the versatility of the proposed method. However, to further improve its performance, a scheme that adapts the parameters based on the statistics of the input image should be considered.
The difference in hue between the resulting and original images is calculated as follows [22]:

\|\Delta h^*\| = 2 \sqrt{C_e^* C_o^*} \sin\left( \frac{|h_e - h_o|}{2} \right),    (9)

where h_e and h_o are the hues of corresponding pixels in the resulting and original images, respectively, C_e* and C_o* are the corresponding chromas, and |·| denotes the absolute value. Furthermore, the difference ΔC* = C_e* − C_o* is calculated to verify the degree of image enhancement. Table 1 lists the averages and standard deviations of ||Δh*|| and ΔC*. The averages of ||Δh*|| for all images produced by the proposed method are smaller than those of the comparison methods, indicating that the hue is almost preserved. Furthermore, the averages of ΔC* indicate that the proposed method improves the chroma more than the comparison methods.
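Equation (9) and ΔC* can be computed per pixel as in the following sketch (hues in degrees):

```python
import numpy as np

def hue_chroma_differences(C_e, h_e, C_o, h_o):
    """CIE hue difference ||Delta h*|| (Eq. (9)) and chroma difference
    Delta C* between corresponding pixels of the resulting and original images."""
    dh = 2.0 * np.sqrt(C_e * C_o) * np.sin(np.radians(np.abs(h_e - h_o)) / 2.0)
    dC = C_e - C_o
    return dh, dC
```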
SSIM [23] is measured as an image quality metric to objectively evaluate the performance of the proposed method; the closer SSIM is to 1, the more structurally similar the enhanced image is to the original. Table 2 shows the SSIM between the original and resulting images with respect to Figure 6; each value is the average of the SSIMs calculated for the individual RGB components. From Table 2, we see that the proposed method gives better results than the comparison methods.
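The channel-averaged SSIM can be sketched with scikit-image as below; the exact window settings used for Table 2 are not stated in the paper, so library defaults are assumed, with images scaled to [0, 1].

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

def mean_rgb_ssim(original, enhanced):
    """Average of SSIM computed separately on the R, G, and B channels,
    as reported in Table 2 (scikit-image default settings assumed)."""
    return float(np.mean([ssim(original[..., c], enhanced[..., c], data_range=1.0)
                          for c in range(3)]))
```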
Figure 7 shows the scatter plots of hue and chroma corresponding to Figure 6. The proposed method enhances the chroma within a limited range of hues. However, because of the relationship between the RGB and CIELAB color spaces, the range of possible values of C* varies depending on L*; therefore, there are areas where C* does not increase uniformly.
Furthermore, Scheffe’s paired comparison test [24,25] was conducted as a subjective evaluation of the resulting images in Figure 6. Pairs of randomly selected images were placed side by side, and six examinees (average age: 20.5 years; four female, two male) judged which of the two images was preferable as a food image. The examinees rated each pair on the scale [−2, −1, 0, 1, 2], where higher values indicate that the right image is preferable to the left image. The yardstick method was used to obtain the evaluation value of each image; a higher value indicates a higher evaluation. Table 3 shows the evaluation values obtained by Scheffe’s paired comparison test. From Table 3, we see that the proposed method is superior to the comparison methods except in the case of image 4.

3.2. Other Applications

Figure 8 shows the experimental results for a flower image. The parameter values are the same as before, except that θ was changed to −60 based on the color information of the original image. The figure shows that the proposed method naturally highlights almost only the color of the blue flowers.

3.3. Computational Load

Table 4 shows the execution times required to obtain Figure 6 on an Intel(R) Core(TM) i5-5200U CPU with 8 GB of RAM, using MATLAB R2020a. The computational cost of the proposed method is relatively high because of the color space conversions involved. However, since the implementation of the proposed method is still naive, we believe that further acceleration is entirely possible. The MATLAB code of the proposed method is available at https://github.com/ta850-z/limited_hues_enhancement, accessed on 20 November 2022.

4. Conclusions

This paper proposed a color image enhancement method that focuses on limited hues. The degree of chroma enhancement is adjusted by considering the magnitude of the current chroma to avoid unnatural coloring of near-achromatic colors. Experiments on actual food images confirmed that the proposed enhancement is effective. Further work is required to improve the proposed method by utilizing the color information of the original image.

Author Contributions

Conceptualization, T.A. and N.S.; methodology, T.A., N.S., K.K. and C.H.; software, T.A. and N.S.; validation, K.K. and C.H.; formal analysis, T.A. and N.S.; investigation, K.K. and C.H.; data curation, T.A. and C.H.; writing—original draft preparation, T.A.; writing—review and editing, T.A.; funding acquisition, T.A. and N.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by JSPS KAKENHI Grant Number 22K12097.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Naik, S.K.; Murthy, C.A. Hue-preserving color image enhancement without gamut problem. IEEE Trans. Image Process. 2003, 12, 1591–1598.
2. Yang, S.; Lee, B. Hue-preserving gamut mapping with high saturation. Electron. Lett. 2013, 49, 1221–1222.
3. Nikolova, M.; Steidl, G. Fast hue and range preserving histogram specification: Theory and new algorithms for color image enhancement. IEEE Trans. Image Process. 2014, 23, 4087–4100.
4. Kamiyama, M.; Taguchi, A. Hue-preserving color image processing with a high arbitrariness in RGB color space. IEICE Trans. Fundam. 2017, 100, 2256–2265.
5. Kinoshita, Y.; Kiya, H. Hue-correction scheme based on constant-hue plane for deep-learning-based color-image enhancement. IEEE Access 2020, 8, 9540–9550.
6. Inoue, K.; Jiang, M.; Hara, K. Hue-preserving saturation improvement in RGB color cube. J. Imaging 2021, 7, 150.
7. Kurokawa, R.; Yamato, K.; Hasegawa, M. Near hue-preserving reversible contrast and saturation enhancement using histogram shifting. IEICE Trans. Inf. Syst. 2022, 105, 54–64.
8. Zhou, D.; He, G.; Xu, K.; Liu, C. A two-stage hue-preserving and saturation improvement color image enhancement algorithm without gamut problem. IET Image Process. 2022, 1–8.
9. Chien, C.L.; Tseng, D.C. Color image enhancement with exact HSI color model. Int. J. Innov. Comput. Inf. Control 2011, 7, 6691–6710.
10. Taguchi, A.; Hoshi, Y. Color image enhancement in HSI color space without gamut problem. IEICE Trans. Fundam. 2015, 98, 792–795.
11. Kinoshita, Y.; Kiya, H. Hue-correction scheme considering CIEDE2000 for color-image enhancement including deep-learning-based algorithms. APSIPA Trans. Signal Inf. Process. 2020, 9, 1–10.
12. Li, G.; Rana, M.A.; Sun, J.; Song, Y. Real-time image enhancement with efficient dynamic programming. Multimed. Tools Appl. 2020, 79, 1–21.
13. The code for the dynamic programming approach developed for the enhancement of color images. Available online: https://github.com/yinglei2020/YingleiSong (accessed on 15 September 2022).
14. Azetsu, T.; Suetake, N. Hue-preserving image enhancement in CIELAB color space considering color gamut. Opt. Rev. 2019, 26, 283–294.
15. Azetsu, T.; Suetake, N. Chroma enhancement in CIELAB color space using a lookup table. Designs 2021, 5, 32.
16. The code for chroma enhancement in CIELAB color space using a lookup table. Available online: https://github.com/ta850-z/color_image_enhancement (accessed on 21 May 2021).
17. Mizokami, Y.; Werner, J.S.; Crognale, M.A.; Webster, M.A. Nonlinearities in color coding: Compensating color appearance for the eye’s spectral sensitivity. J. Vis. 2006, 6, 283–294.
18. Fairchild, M.D.; Pirrotta, E. Predicting the lightness of chromatic object colors using CIELAB. Color Res. Appl. 1991, 16, 385–393.
19. Nayatani, Y. A colorimetric explanation of the Helmholtz–Kohlrausch effect. Color Res. Appl. 1998, 23, 374–378.
20. Fairchild, M.D. Color Appearance Models, 3rd ed.; Wiley: Chichester, UK, 2013.
21. SnapDish. Available online: https://snapdish.co (accessed on 10 May 2021).
22. CIE. Colorimetry; CIE Publication No. 15; CIE Central Bureau: Vienna, Austria, 2004.
23. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
24. Engeldrum, P.G. Psychometric Scaling: A Toolkit for Imaging Systems Development; Imcotek Press: Winchester, UK, 2000.
25. Takagi, H. Practical statistical tests machine learning-III: Significance tests for human subjective tests. Inst. Syst. Control Inf. Eng. 2014, 58, 514–520. (In Japanese)
Figure 1. Weight function k expressed by Equation (3) for α = 3 , θ = 72 , and β = 0.05 , 0.1 , 0.2 .
Figure 2. Experimental results. (a) Original image. (b) Resulting image applying only Equation (3). (c) Resulting image applying Equation (5) to (b).
Figure 3. Experimental results. (a) Original image. (b) Resulting image using the proposed method with Equation (3).
Figure 4. Tone mapping function t expressed by Equation (7) for MIN = 0.2 and MAX = 0.8 .
Figure 5. Flow diagram of the proposed method.
Figure 6. Experimental results. First column (a1–a7): original images. Second column (b1–b7): resulting images using the proposed method with Equation (6). Third column (c1–c7): resulting images using the method in Ref. [2] using histogram equalization. Fourth column (d1–d7): resulting images using the method in Ref. [6] using histogram equalization. Fifth column (e1–e7): resulting images using the method in Refs. [12,13].
Figure 7. Scatter plots of hue and chroma corresponding to Figure 6.
Figure 8. Experimental results. (a) Original image and its scatter plot of hue and chroma (a1). (b) Resulting image using the proposed method with Equation (6) and its scatter plot of hue and chroma (b1).
Table 1. Averages (Ave) and standard deviations (Std) of ||Δh*|| and ΔC*.

Image    ||Δh*|| Ave    ||Δh*|| Std    ΔC* Ave    ΔC* Std

Proposed method using Equation (6):
(b1)    0.003    0.033    8.186    10.942
(b2)    0.001    0.003    3.849    7.585
(b3)    0.001    0.023    5.063    6.331
(b4)    0.005    0.051    4.202    7.976
(b5)    0.002    0.021    7.449    14.096
(b6)    0.004    0.070    13.136    11.835
(b7)    0.020    0.160    10.727    12.332

The method in Ref. [2] using histogram equalization:
(c1)    0.104    0.218    −4.558    3.517
(c2)    0.057    0.074    −4.124    3.367
(c3)    0.078    0.120    −4.154    3.315
(c4)    0.364    0.728    −3.982    8.227
(c5)    0.084    0.135    −2.096    6.152
(c6)    0.227    0.393    −7.997    6.841
(c7)    0.517    0.996    −6.995    6.180

The method in Ref. [6] using histogram equalization:
(d1)    0.104    0.218    −4.546    3.533
(d2)    0.054    0.068    −4.007    3.446
(d3)    0.087    0.118    −3.937    3.551
(d4)    0.266    0.586    −0.143    8.605
(d5)    0.263    0.584    0.764    7.378
(d6)    0.305    0.417    −6.947    8.024
(d7)    0.369    0.700    −4.943    7.690

The method in Refs. [12,13]:
(e1)    0.434    1.329    −0.961    3.029
(e2)    0.411    1.259    −1.130    4.859
(e3)    0.473    1.335    −1.492    4.246
(e4)    1.112    2.783    7.764    8.496
(e5)    0.093    0.260    −4.449    5.866
(e6)    0.770    1.776    −3.361    5.128
(e7)    0.476    1.327    0.666    6.274
Table 2. SSIMs between the original images and the resulting images with respect to Figure 6.

Number    (b)    (c)    (d)    (e)
1    0.939    0.714    0.714    0.751
2    0.973    0.341    0.341    0.300
3    0.973    0.576    0.576    0.480
4    0.952    0.468    0.469    0.378
5    0.925    0.607    0.603    0.580
6    0.896    0.760    0.761    0.806
7    0.881    0.841    0.835    0.848
Table 3. Evaluation values by Scheffe’s paired comparison test with respect to Figure 6.

Number    (b)    (c)    (d)    (e)
1    1.167    −0.625    −0.708    0.167
2    0.750    −0.375    −0.417    0.042
3    1.042    −0.250    −0.125    −0.667
4    0.250    −0.875    −0.250    0.875
5    0.708    −0.375    0.417    −0.750
6    0.625    −0.375    −0.292    0.042
7    0.917    −0.667    −0.750    0.500
Table 4. Execution times (s) required to obtain Figure 6.

Number    (b)    (c)    (d)    (e)
1    1.69    0.25    0.33    2.71
2    6.25    0.69    1.10    21.04
3    6.26    0.68    1.01    21.07
4    5.10    0.60    0.84    14.89
5    5.11    0.64    0.90    19.47
6    5.00    0.62    0.87    16.99
7    2.64    0.36    0.49    5.59
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
