Open Access Article

Colour Constancy for Image of Non-Uniformly Lit Scenes

1 School of Computing, Creative Technology and Engineering, Leeds Beckett University, Leeds LS1 3HE, UK
2 Communications and Intelligent Systems Group, School of Engineering and Computer Science, University of Hertfordshire, Hatfield AL10 9EU, UK
* Author to whom correspondence should be addressed.
This paper is an extended version of the conference paper: Hussain, M.A.; Sheikh-Akbari, A. Colour Constancy for Image of Non-Uniformly Lit Scenes. In Proceedings of the 2018 IEEE International Conference on Imaging Systems and Techniques (IST), Krakow, Poland, 16–18 October 2018.
Sensors 2019, 19(10), 2242; https://doi.org/10.3390/s19102242
Received: 21 February 2019 / Revised: 9 May 2019 / Accepted: 10 May 2019 / Published: 15 May 2019

Abstract

Digital camera sensors are designed to record all incident light from a captured scene, but they are unable to distinguish between the colour of the light source and the true colour of objects. The resulting captured image exhibits a colour cast toward the colour of the light source. This paper presents a colour constancy algorithm for images of scenes lit by non-uniform light sources. The proposed algorithm uses a histogram-based method to determine the number of colour regions. It then applies the K-means++ algorithm to the input image, dividing the image into its segments. The proposed algorithm computes the Normalized Average Absolute Difference (NAAD) for each segment and uses it as a measure to determine whether the segment has sufficient colour variation. Initial colour constancy adjustment factors are then calculated for each segment with sufficient colour variation. The Colour Constancy Adjustment Weighting Factors (CCAWFs) for each pixel of the image are determined by fusing the CCAWFs of the segments, weighted by the normalized Euclidean distance of the pixel from the centre of each segment. Results show that the proposed method outperforms the statistical techniques, and its images exhibit significantly higher subjective quality than those of the learning-based methods. In addition, the execution time of the proposed algorithm is comparable to statistical-based techniques and is much lower than those of the state-of-the-art learning-based methods.
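The pipeline described in the abstract can be sketched in code. The following is a minimal illustrative reconstruction, not the authors' implementation: the histogram bin count and occupancy threshold, the NAAD acceptance threshold, the use of grey-world gains as the per-segment adjustment factors, and the inverse-distance form of the fusion weights are all assumptions made here for the sake of a runnable example.

```python
import numpy as np

def estimate_num_regions(flat, bins=8, min_fraction=0.04):
    # Histogram-based estimate of the number of dominant colour regions:
    # count coarse RGB histogram bins holding at least `min_fraction` of pixels.
    hist, _ = np.histogramdd(flat, bins=(bins,) * 3, range=[(0.0, 1.0)] * 3)
    return max(1, int((hist / hist.sum() >= min_fraction).sum()))

def kmeans_pp(X, k, iters=10, rng=None):
    # Minimal k-means with k-means++ seeding (stand-in for a library routine).
    rng = np.random.default_rng(rng)
    centres = [X[rng.integers(len(X))]]
    for _ in range(1, k):
        d2 = ((X[:, None, :] - np.array(centres)[None]) ** 2).sum(-1).min(1)
        centres.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    centres = np.array(centres)
    for _ in range(iters):
        labels = ((X[:, None, :] - centres[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(0)
    return labels

def naad(pixels):
    # Normalized Average Absolute Difference per channel: mean|x - mean| / mean.
    mean = pixels.mean(axis=0)
    return np.abs(pixels - mean).mean(axis=0) / np.maximum(mean, 1e-6)

def colour_constancy(img, naad_thresh=0.1):
    h, w, _ = img.shape
    flat = img.reshape(-1, 3)
    k = estimate_num_regions(flat)
    labels = kmeans_pp(flat, k, rng=0)
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), -1).reshape(-1, 2)
    gains, centres = [], []
    for s in range(k):
        mask = labels == s
        seg = flat[mask]
        # Keep only segments with sufficient colour variation (NAAD test).
        if np.all(naad(seg) >= naad_thresh):
            gains.append(seg.mean() / np.maximum(seg.mean(axis=0), 1e-6))
            centres.append(coords[mask].mean(axis=0))
    if not gains:  # fall back to global grey-world correction
        gains = [flat.mean() / np.maximum(flat.mean(axis=0), 1e-6)]
        centres = [coords.mean(axis=0)]
    gains, centres = np.array(gains), np.array(centres)
    # Fuse segment gains per pixel, weighted by inverse normalized
    # Euclidean distance from the pixel to each segment centre.
    d = np.linalg.norm(coords[:, None, :] - centres[None, :, :], axis=2)
    wgt = 1.0 / np.maximum(d / np.hypot(h, w), 1e-6)
    wgt /= wgt.sum(axis=1, keepdims=True)
    return np.clip(flat * (wgt @ gains), 0.0, 1.0).reshape(h, w, 3)
```

Applying `colour_constancy` to an RGB image in [0, 1] yields a corrected image of the same shape; per-segment grey-world gains approximately equalize the channel means within each illuminant region, and the distance-weighted fusion blends corrections smoothly across region boundaries.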
Keywords: charge-coupled device sensor; colour constancy; multi-illuminants; k-means segmentation; fusion

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Hussain, M.A.; Sheikh-Akbari, A.; Mporas, I. Colour Constancy for Image of Non-Uniformly Lit Scenes. Sensors 2019, 19, 2242.


Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.