Article

Using a Reference Color Plate to Correct Smartphone-Derived Soil Color Measurements with Different Smartphones Under Different Lighting Conditions

1 Fredericton Research and Development Centre, Agriculture and Agri-Food Canada, Fredericton, NB E3B 4Z7, Canada
2 Department of Soil Science, University of Manitoba, Winnipeg, MB R3T 2N2, Canada
3 Department of Geography, Brandon University, Brandon, MB R7A 6A9, Canada
* Author to whom correspondence should be addressed.
Soil Syst. 2025, 9(3), 93; https://doi.org/10.3390/soilsystems9030093
Submission received: 30 June 2025 / Revised: 19 August 2025 / Accepted: 21 August 2025 / Published: 26 August 2025

Abstract

Soil color has long been used as an indicator for soil properties such as soil organic carbon and soil moisture. Recent developments in citizen science have seen the increased use of smartphone cameras for soil color measurements. However, this technique is associated with high errors, two major sources of which are the smartphone camera and the lighting conditions. These errors limit the applicability of the technique in citizen science. Existing correction methods are either ineffective or too complicated and difficult to apply, and there is a lack of systematic analysis on how much these methods actually reduce the errors. In this study, we tested the effectiveness of using a color plate as a reference to reduce the errors in color measurements caused by using different smartphones and by taking photos under different lighting conditions. Three types of objects were tested: the squares on the color plate itself, the color chips in a Munsell soil color book, and soil samples. The raw values of the color parameters showed different patterns of bias with different smartphones and lighting conditions. The calibration reduced the errors consistently for all smartphones under all lighting conditions for the color plate squares. For the Munsell book chips and the soil samples, the calibration did not always reduce the bias, but it did reduce the variation in all color parameters among smartphones and lighting conditions and, therefore, improved the precision of the color measurements.

1. Introduction

Soil color is one of the most important soil morphological characteristics and is often the first property recognized or recorded by a soil scientist or a layperson [1,2]. Soil color is determined by many factors and has therefore been used as an indicator for various soil properties. The most common application of soil color is probably to estimate soil organic carbon and soil moisture [3,4,5,6]. Soil color has also been used in the studies of soil genesis, classification, texture, structure, and nutrients [7,8,9,10].
Despite the apparent connections between soil color and many important soil properties, until recently, the measurement and use of soil color have been largely descriptive and qualitative. Several factors contribute to this. Physically, color is determined by light reflectance; for a real-life object, the reflected light is a mixture of light at different wavelengths. Color is therefore also determined by how this reflectance is perceived by the human eye. As such, the science of color is complex, spanning disciplines such as physiology, psychology, physics, chemistry, and mineralogy [1]. Human eyes have three different color-response mechanisms and, therefore, color spaces generally use three parameters to represent the three color stimuli [11,12]. Popular color spaces include the RGB color systems widely used in electronic devices and the CIE color systems defined by the International Commission on Illumination (CIE), which are often used as reference standards [13]. In soil science, soil color is traditionally recorded using the Munsell system, which defines the color space using three parameters: hue for the type of a color, value for its lightness, and chroma for its saturation [14]. In practice, soil color is determined visually and subjectively by comparing a soil sample to the chips in a Munsell soil color book. There are often substantial errors associated with this method and, therefore, there is a widespread perception that soil color cannot be measured accurately [4,15].
With the development of color science and spectroscopy, instruments for color measurement have evolved significantly over the past century [16]. Modern instruments have enabled soil scientists to measure soil color more precisely and accurately and, more importantly, more objectively, without relying heavily on the experience and judgment of the operator. Therefore, more and more studies have been using colorimeter sensors and spectrometers for soil color data acquisition [4,5,9,17,18,19,20]. Another method for color data acquisition is image analysis. Early applications of image analysis for soil properties include air photos and photos taken with handheld digital cameras [21,22,23,24,25]. A special type of handheld digital camera is the smartphone camera. Smartphones are increasingly becoming a must-have portable device for everyone, and every smartphone has a camera. With the rapid development of camera capability and applications, smartphones are increasingly being used as readily available, convenient yet powerful detection devices by ordinary citizens [2]. The potential for recording and estimating various soil properties (including color) with smartphones through citizen science is tremendous.
Many studies have been conducted in the past decade to use image analysis (especially with smartphones) as an alternative to specialized colorimeters or spectrometers for soil color measurement and to further estimate other soil properties that correlate with color. For example, Gómez-Robledo et al. [26] developed an Android application that obtains the color parameters in the Munsell and CIE color spaces. The authors found that their method had lower errors than the traditional method of visually determining soil color from a Munsell soil color book. Aitkenhead et al. [24] extracted color and image texture information from photos taken with an iPhone 2 and used a neural network model to estimate multiple soil properties (soil structure, soil texture, bulk density, pH, and drainage category). Han et al. [10] developed a method to use a smartphone as a color sensor for soil classification. Taneja et al. [27] tested various models for estimating soil organic matter and soil moisture from smartphone images.
Despite the success of using image analysis for color detection, camera-derived color has been reported to have high levels of error [2]. Two major sources of error are the optical characteristics of the camera and the lighting conditions (illumination). Each camera has a combination of lens and sensor that, as a whole, adds its own signature to the digital color data recorded by the smartphone. Different camera settings complicate the situation even further. As a result, the color parameters recorded for the same object with different cameras are not the same, even if all other conditions are kept constant. Gómez-Robledo et al. [26] noted the possible impact of the smartphone camera on the accuracy of the predicted soil properties. Kirillova et al. [28] found that the color prediction results from two cameras were different and that this discrepancy could not be effectively calibrated using an external standard. Yang et al. [29] tested five smartphones and found that the color detection from the five phones followed different patterns in different wavelength ranges. Another well-known source of error in color measurement is photographic lighting (illumination). As such, the CIE published a series of standard illuminants and recommended four illuminating and viewing geometries [16].
One strategy to control these errors is to standardize the camera and the light source. For example, many studies used a single camera or smartphone for all the samples [3,6,24]. To control the lighting condition, a standard light source was often used, and the lighting setup was carefully designed [26,30]. In fact, specialized color-measuring instruments use an internal light source that follows these CIE standards [31]. This strategy may work well for scientific research, but its technical requirements and high costs significantly restrict its use, especially for measurements in the field or by untrained citizens using smartphones. Another strategy is to convert the raw data to the true data via a calibration. Baek et al. [32] developed a method to calibrate different lighting conditions based on the illuminance and color temperature of the source light. However, the calibration functions vary with the object being measured, and this approach does not correct the errors arising from different cameras [33]. A more popular method is to use an external color reference for calibration. The true values of the color parameters of the reference are measured with a reference method (e.g., a high-end spectrometer). The color reference is placed beside the target object when the photo is taken so that both are subject to the same errors. Based on the true values of the color reference, the color of the target object can then be corrected. One example of this strategy is the color checker card, which is often used with image postprocessing software in photography to correct the color of a digital camera [30,34]. A less sophisticated but more common practice is to use a gray card to adjust the white balance of an image, which is believed to correct the illumination difference to some degree [3,6]. However, it has been reported that a single-color reference is not sufficient to correct the color [28].
Going one step further, Levin et al. [23] used plastic chips of different colors for calibration, which the authors claimed could correct errors associated with both the camera and the lighting conditions. Aitkenhead et al. [24,25] adopted this method but enhanced its applicability by replacing the plastic color chips with an easily portable reference card with bands in different shades of gray, from white to black. However, none of these studies validated the effectiveness of the calibration method or quantified the improvement in color detection achieved with the calibration. Given the complicated error patterns of different smartphones in different wavelength ranges reported by Yang et al. [29], a handful of data points provided by these references is likely not enough to calibrate the full range of the color space, although it may still be better than a single-color reference. For soil color measurement, the most sophisticated calibration so far was probably the one conducted by Kirillova et al. [28], who used a set of subsamples of the soil samples as the reference. Although this approach may enhance the accuracy of the color detection significantly, preparing such a reference is difficult and time consuming. The applicability of the reference is also questionable, as the range of soil colors can vary considerably from region to region, from soil type to soil type, and even from field to field.
In summary, there is great potential for the use of smartphones as a tool in citizen science for soil color detection. This potential is limited by the errors associated with the smartphone used for taking the photo and the lighting conditions under which the photo is taken. There is still no validated method that can be easily applied to correct these errors, and no systematic analysis has been conducted on how much such correction methods actually reduce the errors. In this study, we propose the use of a color checker card (herein termed the color plate) with a range of color squares, placed beside the object of interest, as the color reference to calibrate color measurements derived from image analysis. The objective is to quantify the effectiveness of the proposed color calibration method for color parameters derived from different cameras under different lighting conditions for different objects, including soil samples.

2. Materials and Methods

2.1. Measuring Objects

Three different types of objects were used in this study. The first was a commercially available color plate (Spyder Checkr 24 Target Color Cards, Datacolor©, Lawrenceville, NJ, USA), designed to be used together with specialized software for postprocessing images to correct color bias caused by cameras (Figure 1). To provide correction for the whole color space, the color plate has 24 color squares that are distributed evenly in the color space. Taking advantage of such a design, the color plate squares were used as the color references for calibration in this study. They also served as a measuring object in this study following the leave-one-out cross-validation procedure, which will be described in detail in later sections.
The second type of object was the color chips in sheets taken from a Munsell soil color book (Pantone©, Carlstadt, NJ, USA), which is widely used for soil color determination in soil science. Seven sheets from the book were used, namely those for 10R, 2.5YR, 5YR, 7.5YR, 10YR, 2.5Y, and 5Y, covering a broad range of soil colors commonly found in Canada. It should be noted that because the surface of the color chips is glossy, specular reflection caused some chips to show white glare in some images (Figure 1). The color detected from image analysis for these chips obviously did not represent their true color; they were therefore excluded from the analysis, leaving a total of 219 color chips in the data analysis. Each color chip also has a unique published color in the Munsell color book. These published Munsell color parameters were used as a benchmark for validating the reference color measurements (measured with the FieldSpec 4, as described in Section 2.2).
The third type of object was soil samples. These were collected from sites in three Canadian provinces, namely New Brunswick (NB), Prince Edward Island (PEI), and Manitoba (MB), with 10 samples from each site, for a total of 30 soil samples. These samples were selected because they represented the range of soil colors observed in Canada, from the blackish Chernozems in the prairies to the yellowish Luvisolic and Podzolic soils in eastern Canada and the distinct red soils of PEI. At each site, the samples were taken along a transect going down the slope so that they also represented the soil catena along the slope, which typically spans a range of nutrient levels, especially soil organic carbon content. All samples were air-dried, passed through a 2 mm mesh sieve, and evenly spread out (~1 cm thick) in Petri dishes for measurement or imaging.

2.2. Reference Color Values Measured with FieldSpec 4

The color reflectance of all objects was measured with a FieldSpec 4 (Malvern Panalytical, Boulder, CO, USA), a high-resolution spectroradiometer. This instrument operates across a spectral range of 350–2500 nm, providing precise reflectance data that often serve as a benchmark for accurate color measurements [17]. All measurements were conducted in a lab, and CIE standard illuminant A was selected to minimize external light variability. During the measurement, the probe was positioned at a fixed distance of 3 cm and held at a 45-degree angle towards the object. A white reference panel was used every ten measurements to standardize the reflectance values and maintain the accuracy of the data. Three replicate measurements were taken for each object, and each measurement consisted of ten spectral reflectance readings, which were averaged to improve the accuracy of the data.
The spectral reflectance values were averaged over the wavelength ranges corresponding to the blue (450–495 nm), green (495–570 nm), and red (620–750 nm) color channels. These averages were then scaled to an 8-bit format (0–255) to give the three color parameters, red (R), green (G), and blue (B), of the RGB color space. The RGB values were converted to Munsell hue (H), value (V), and chroma (C) values using the munsellinterpol (version 3.2.0) package in R [35]. An adjustment was applied to the H values to account for the circular nature of the hue scale, which represents continuous transitions between color types. The R package uses a 0 to 100 scale for H, with the values of 0 and 100 both corresponding to red. For a soil with an H value close to 0, a small error may therefore appear as a huge difference in H (e.g., an H value of 1 reported as 99 corresponds to an error of only 2 H units). Given that red hues were common among the soil samples in this study, we applied a shift of 50 units so that the start and end of the scale fell on the rarely observed blue hues, minimizing the effect of the circular scale on the error analysis. This adjustment was applied to all hue values in this study.
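As an illustration, the conversion from averaged reflectance to RGB and then to shifted Munsell HVC could be sketched in R as follows. This is a minimal sketch, not the study's code: the spectrum is a placeholder, and the RGBtoMunsell() call assumes the interface and default sRGB settings of recent munsellinterpol versions.

library(munsellinterpol)   # provides RGBtoMunsell(); interface assumed from recent package versions

## Hypothetical FieldSpec-style spectrum: wavelengths in nm, reflectance as a fraction (0-1)
wavelength  <- 350:2500
reflectance <- runif(length(wavelength), 0.2, 0.6)   # placeholder values only

## Average reflectance within each channel's wavelength range, then scale to 8-bit (0-255)
band_mean <- function(lo, hi) mean(reflectance[wavelength >= lo & wavelength <= hi])
rgb_8bit  <- round(255 * c(R = band_mean(620, 750),
                           G = band_mean(495, 570),
                           B = band_mean(450, 495)))

## Convert RGB (0-255, sRGB assumed) to Munsell hue (0-100 scale), value, and chroma
hvc <- RGBtoMunsell(matrix(rgb_8bit, nrow = 1))

## Shift hue by 50 units on the circular 0-100 scale so the break falls on the
## rarely observed blue hues instead of red
hvc[, 1] <- (hvc[, 1] + 50) %% 100
hvc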

2.3. Image Acquisition

The three types of objects were placed on a table and arranged with the color plate and one soil sample in the middle and the Munsell color book sheets on the sides (Figure 1). While there was only one soil sample in each picture, the same color plate and Munsell color book sheets were in every picture. Four smartphones were used, namely an iPhone 14 (Apple Inc., Cupertino, CA, USA), a Huawei Mate 10 (Huawei Technologies Co., Ltd., Shenzhen, China), a Samsung Galaxy S23, and a Samsung Galaxy S23 Ultra (Samsung Electronics Co., Ltd., Suwon, Republic of Korea). These smartphones were selected because they were produced in different years by different companies and represent different models. As a result, the camera specifications of the four smartphones vary significantly, potentially influencing their color acquisition performance (Table 1).
The images were captured under six lighting conditions, including two indoor and four outdoor conditions, chosen to represent typical lighting in both laboratory and field studies. The two indoor lighting conditions, Inside-Dim and Inside-Normal, were set by turning on two-thirds and all of the lights in the room, respectively, to represent different lighting conditions in a lab. The four outdoor lighting conditions, Overcast-AM, Overcast-PM, Sunny-AM, and Sunny-PM, were designed to represent the lighting at different times of day (morning and afternoon) under typical weather conditions (overcast or sunny) for field experiments.
All images were taken from a fixed top-down perspective, with the smartphone pointed straight down and positioned approximately 30 cm above the table surface. The default camera mode was used, with auto-focus and auto-exposure enabled to allow each device to adjust to the lighting conditions, but flash and HDR modes were disabled to prevent artificial color enhancement. Each layout (and thus every soil sample) was photographed three times under each lighting condition with each smartphone, for a total of 2160 photos. The photos were saved in JPEG format without further compression.

2.4. Image Processing and Color Calibration

Each image was scaled, rotated, and cropped to a standardized resolution of 2821 × 3520 pixels using the GIMP software (version 2.10.38). A region of interest (ROI) was defined for each object in the photo (Figure 1). The ROIs were 30 × 30, 5 × 5, and 300 × 300 pixel areas for the color plate squares, Munsell book chips, and soil samples, respectively, all located in the central area of each object. Each image therefore contained a total of 244 ROIs: 1 for the soil sample, 24 for the color plate squares, and 219 for the Munsell book chips. For each ROI, the RGB values of all pixels were extracted and averaged using the magick package (version 2.8.7) in R [36]. The RGB values were then converted to Munsell HVC values using the munsellinterpol package (version 3.2.0) in R. These RGB and HVC values were used as the raw data before color calibration.
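A minimal sketch of the ROI extraction step is given below, assuming a hypothetical file name and ROI position; the magick and munsellinterpol calls mirror the packages cited above, but the actual processing scripts may differ.

library(magick)            # image_read(), image_crop(), image_data()
library(munsellinterpol)   # RGBtoMunsell()

## Hypothetical file name and ROI geometry (width x height + x-offset + y-offset, in pixels)
img <- image_read("BR-317_iphone14_inside-dim.jpg")
roi <- image_crop(img, geometry = "300x300+1200+1800")   # 300 x 300 px ROI for a soil sample

## image_data() returns a raw array [channel, x, y]; average each channel over the ROI
px <- as.integer(image_data(roi, channels = "rgb"))
dim(px) <- c(3, 300, 300)
rgb_mean <- apply(px, 1, mean)    # mean R, G, B (0-255): the raw, uncalibrated RGB values

## Convert the ROI mean RGB to raw (uncalibrated) Munsell HVC
hvc_raw <- RGBtoMunsell(matrix(rgb_mean, nrow = 1))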
The color calibration was performed separately for each color parameter in each image, based on the image-derived versus the FieldSpec 4-measured values of the reference color plate. The image-derived raw values of a color parameter for the color plate squares were plotted against the corresponding FieldSpec 4-derived values, and a linear regression model was established between the two sets of data. The regression model for a given color parameter was then applied to each object in the image to obtain the calibrated value (aligned to the FieldSpec 4 measurements) for that parameter. This process was repeated for each color parameter and for all objects in the image, so that, in the end, each object in the image received a new set of values, considered the calibrated (or corrected) values, for all color parameters. It should be noted that for the color plate squares, the calibration followed a leave-one-out cross-validation procedure (also called jackknife cross-validation): to calculate the calibrated value for a given square, the linear regression model was built on the data for the other 23 squares, leaving out only the square being calibrated. This avoids potential bias arising from the square in question being included in the regression analysis. For the Munsell book chips and the soil samples, all 24 squares were used in building the regression models.
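The per-image calibration and the leave-one-out procedure could be sketched as follows; plate_phone, plate_ref, chip_raw, and soil_raw are hypothetical vectors holding one color parameter for the 24 plate squares (smartphone-derived and FieldSpec 4-measured) and for the other objects in the same image.

## plate_phone: raw smartphone-derived values of one color parameter for the 24 plate squares
## plate_ref:   FieldSpec 4-measured values for the same squares
## chip_raw / soil_raw: raw values for the Munsell book chips and the soil sample in the image
calibrate <- function(raw, plate_phone, plate_ref) {
  fit <- lm(plate_ref ~ plate_phone)                    # reference = a + b * raw, fitted per image
  as.numeric(predict(fit, newdata = data.frame(plate_phone = raw)))
}

## Munsell book chips and the soil sample: all 24 squares are used to build the model
chip_cal <- calibrate(chip_raw, plate_phone, plate_ref)
soil_cal <- calibrate(soil_raw, plate_phone, plate_ref)

## Color plate squares: leave-one-out cross-validation, i.e., square i is calibrated
## with a regression built on the other 23 squares only
plate_cal <- sapply(seq_along(plate_phone), function(i)
  calibrate(plate_phone[i], plate_phone[-i], plate_ref[-i]))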

2.5. Precision and Accuracy Assessment for the Uncalibrated and Calibrated Data

All data analyses were conducted in R using the tidyverse package (version 2.0.0) [37]. For the FieldSpec 4 spectrometer data, the mean, standard deviation (SD), and coefficient of variation (CV) were calculated from the three repeated measurements for each color parameter of each object. Statistical metrics (mean, minimum, maximum, percentiles, and range) of these means, SDs, and CVs were then calculated for each type of object to assess the precision and accuracy of the FieldSpec 4 measurements.
To assess the precision and accuracy of the smartphone image analysis data, the smartphone-derived values were plotted against the FieldSpec 4-measured values, and a linear regression model was established between the two sets of data. The coefficient of determination (R2) of the model, the slope of the regression line, and its distance from the 1:1 line were used to assess the accuracy and precision of the smartphone-derived values. The error of the smartphone-derived value for each color parameter of each object in each image was calculated by subtracting the corresponding FieldSpec 4-measured value from it. The mean and SD of the errors of a given object across the different lighting conditions were calculated for each phone, and statistical metrics (mean, minimum, maximum, percentiles, and range) of these means and SDs among all objects of a given type were calculated to assess the effect of lighting conditions on the accuracy and precision of the color parameters. Similarly, the mean and SD of the errors of a given object across the different smartphones were calculated for each lighting condition, and the same statistical metrics among all objects of a given type were calculated to assess the effect of smartphones. These analyses were performed separately for each color parameter and for the uncalibrated and calibrated smartphone-derived values so that the improvement due to the calibration could be quantified for each individual color parameter.
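A hedged sketch of the error summaries is shown below, assuming a hypothetical long-format data frame named results with columns for object type, object, smartphone, lighting condition, color parameter, and the smartphone-derived and FieldSpec 4-measured values; the column names are illustrative only.

library(tidyverse)

## 'results' is a hypothetical long-format data frame: one row per object, smartphone,
## lighting condition, and color parameter, with the smartphone-derived value
## (phone_value) and the FieldSpec 4-measured value (fieldspec_value)
errors <- results %>%
  mutate(error = phone_value - fieldspec_value)   # error = smartphone value - FieldSpec 4 value

## Effect of lighting conditions: for each smartphone, the mean and SD of each object's
## errors across the six lighting conditions, then summary statistics over all objects
lighting_effect <- errors %>%
  group_by(object_type, parameter, phone, object) %>%
  summarise(mean_err = mean(error), sd_err = sd(error), .groups = "drop") %>%
  group_by(object_type, parameter, phone) %>%
  summarise(across(c(mean_err, sd_err), list(mean = mean, min = min, max = max)),
            .groups = "drop")

## The effect of smartphones is assessed the same way, grouping by lighting condition
## instead of by phone in the final grouping step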
It should be noted that, for the same layout of a specific soil sample (the color plate and all the Munsell book sheets were in every photo), although three images were taken with a specific smartphone under a specific lighting condition, only one image was used in the above-mentioned analyses. This was done because analysis of the three images showed that the repeatability of the derived data was very high and the differences were negligible (typically less than 0.1%). Using only one image also reflects the citizen science scenario more realistically, since a requirement to take three pictures could be a hurdle for ordinary citizens participating in such exercises. In the same vein, for the color plate square and Munsell book chip analyses, only the photos for one soil sample needed to be used, because the color plate and Munsell book sheets appeared in every photo and, as such, there were 30 photos (one per soil sample) for each smartphone and lighting condition. The results were almost identical regardless of which soil sample was picked. The data presented in this manuscript were derived from a randomly selected soil sample (BR-317) from the New Brunswick site.

3. Results

3.1. The FieldSpec 4 Measurements

The FieldSpec 4 measurements were very precise, as evidenced by the low standard deviation (SD) and coefficient of variation (CV) values of the three repeated readings for all color parameters in both color spaces and all three types of objects tested in this study (Table 2). For the color plate squares, in the RGB color space, the mean SDs for the color parameters ranged from 0.08 to 0.14 and the mean CVs ranged from 0.08% to 0.16%, while in the Munsell color space, the mean SDs ranged from 0.003 to 0.021 and the mean CVs ranged from 0.03% to 0.27%. Even at the 90th percentile, the highest SD was only 0.33 and the highest CV was only 0.51% among all color parameters in both color spaces.
For the Munsell book chips and the soil samples, the SD and CV values were noticeably higher than those for the color plate squares. The higher values for the soil samples were likely due to the higher natural variation in the color of soil samples; the color of a soil sample obviously is not as uniform as that of the color plate squares, which were factory-made to serve as color references. The soil samples also had rougher surfaces, which may introduce more noise due to shadows created by the particles. For the Munsell book chips, the higher SD and CV values may also be related to the surface condition of the chips, which were glossy and reflective, making their actual color hard to discern under varying light (those with white glare had already been excluded from the analysis). In addition, the Munsell color chips were in sheets, which were hard to fit under the FieldSpec 4 for measurement, so human operational errors could also be higher. Nevertheless, the SD and CV values were still very low even at the 90th percentile, with the highest SD and CV values of 1.52 and 1.54%, respectively, for the Munsell book chips and 0.66 and 0.71%, respectively, for the soil samples.
The accuracy of the FieldSpec 4 and of the formulas used to convert the spectral measurements to the Munsell color parameters was validated against the book values for the Munsell book chips. For hue, value, and chroma, the FieldSpec 4-measured values all correlated very well with the book values, with R2 values of 0.96, 0.97, and 0.96, respectively (Figure 2). The FieldSpec 4-measured values generally fell close to the 1:1 line in relation to the book values, and the slopes of the regression lines were all close to 1, indicating that the FieldSpec 4-measured values matched the book values well. However, there were noticeable discrepancies. First, for the same book value, the FieldSpec 4-measured values varied. Among the three Munsell color parameters, chroma appeared to have the highest variation, reflected in the wider spread of data points for the same book value compared with the other two parameters (Figure 2). Also, for both chroma and hue, there were slight but consistent underestimations, and for value, the slope of the regression line deviated the most from the 1:1 line. These discrepancies could be due to measurement errors, distortions resulting from the conversion formulas, or errors in the Munsell soil color book.

3.2. The Color Plate Squares

The leave-one-out cross-validation with the color plate showed that, when using the color plate as a reference, the variations due to lighting condition and smartphone could be reduced substantially for all six color parameters examined in this study. The adjustment with the color plate can be well illustrated using the value parameter (V) in the Munsell color space as an example. Before calibration with the color plate, the smartphone-derived V obtained with an iPhone 14 under a given lighting condition correlated very well with the FieldSpec 4-measured V (R2 ranged from 0.97 to 0.99, Figure 3(Aa)). This means that the relative differences in V were reflected well in the smartphone-derived V under a fixed lighting condition. However, the regression lines differed considerably between lighting conditions: some had a consistent bias (e.g., Inside-Dim), and some had a steeper slope than the 1:1 line (e.g., Overcast-AM and Overcast-PM). As a result, the smartphone-derived V varied over a wide range, even with the same phone, when data from the six lighting conditions were pooled together (Figure 3(Ac)). This means that the lighting condition introduced a large uncertainty into the smartphone-derived V.
After calibrating the smartphone-derived V using the color plate, under each lighting condition, the data points were brought closer to the 1:1 line, although the correlation coefficients remained similar or even dropped slightly (R2 ranged from 0.96 to 0.99, Figure 3(Ab)). As a result, when data from the six lighting conditions were pooled together, the variation ranges were much smaller than those of the uncalibrated data (Figure 3(Ac,Ad)). Accordingly, the correlation also greatly improved (R2 increased from 0.83 to 0.98), and the regression line also aligned much better with the 1:1 line. The improvement is also reflected well in the errors of the smartphone-derived V (compared to FieldSpec 4-measured V), where the data points were brought closer to the zero line (smaller error values) and closer to each other (smaller variation) (Figure 3(Ae,Af)).
Similar effects were observed for all other color parameters and for all other smartphones. After the calibration, the error values were reduced by an order of magnitude or more in most cases, indicating that the calibration brought the smartphone-derived values of all color parameters much closer to the FieldSpec 4-measured values (Table 3). Moreover, the standard deviations (SDs) of the errors, a measure indicating the dispersion of the data due to different lighting conditions, were also drastically reduced (by half in most cases), except for those for hue, which increased after the calibration. The narrower ranges of the errors indicate that the calibration increased the precision of the smartphone-derived values.
The effects of calibration using the color plate on different phones were similar to those on different lighting conditions described above. Using the Munsell V under the Inside-Dim lighting condition as an example, before calibration, the smartphone-derived V for individual smartphones correlated very well (R2 ranged from 0.96 to 0.98) with the FieldSpec 4-measured V, indicating that the smartphone-derived V reflected the relative differences in V very well (Figure 3(Ba)). However, there were noticeable differences among the regression lines of individual smartphones and, as a result, the smartphone-derived V varied over a wide range when data from the four smartphones were pooled together (Figure 3(Bc)). The overall correlation between the smartphone-derived and the FieldSpec 4-measured V was still strong (R2 of 0.81), but there were noticeable differences between the regression line and the 1:1 line. After calibration, the data points were brought closer to the 1:1 line for individual smartphones, although the correlation coefficients stayed similar (R2 ranged from 0.94 to 0.97) (Figure 3(Bb)). When data from the four smartphones were pooled together, the data points clustered much closer to the 1:1 line and to each other than the uncalibrated data (Figure 3(Bd)), and the regression had a much higher coefficient of determination (R2 of 0.96). Overall, from the perspective of errors, the calibration reduced both the variation in the errors (more precise) and their magnitude (more accurate) (Figure 3(Be,Bf)). Similar effects were observed for all other parameters and under all other lighting conditions. After the calibration, the error values were reduced dramatically (by one to two orders of magnitude in most cases) and, except for Munsell H, the SDs of the errors were also greatly reduced (by half in most cases) (Table 4).
When data from all combinations of lighting condition and smartphone were pooled together, the effect of calibration was similar: it increased the correlation between the smartphone-derived and the FieldSpec 4-measured values (for V, R2 increased from 0.84 to 0.95), and the ranges of error were also greatly reduced (Figure 4). Similar patterns can be seen for the other color parameters (Figure S1). These results indicate that, under variable lighting conditions and with different smartphones, both the precision and accuracy of the smartphone-derived values of all color parameters (except for the precision of Munsell H) can be substantially enhanced with the calibration.

3.3. The Munsell Book Chips

For the Munsell book chips, with a given smartphone under a given lighting condition, the uncalibrated smartphone-derived values also correlated well with the FieldSpec 4-measured values. However, the correlations were not as strong as those for the color plate squares under the same settings. Taking the Munsell V as an example again, with the iPhone 14, the R2 for the Munsell book chips ranged from 0.89 to 0.95 across lighting conditions, lower than that for the color plate squares (0.97 to 0.99, Figure 3(Aa)). This indicates higher variation and thus lower precision for the Munsell book chips than for the color plate squares, which could be due to the material itself (the color is less uniform on the Munsell book chips than on the color plate squares) or to the larger errors associated with the FieldSpec 4 measurements on the Munsell book chips, as described previously (Table 2). There was also a consistent underestimation of all color parameters, as evidenced by the means of the errors (calculated as the smartphone-derived values minus the FieldSpec 4-measured values), which were negative with very few exceptions (Table 3). The larger errors associated with the FieldSpec 4 measurements on the Munsell book chips cannot explain this systematic negative bias, because those errors were not always negative (e.g., Munsell values had positive mean errors over most of the data range, Figure 2). It can be speculated that the bias was mainly due to material differences between the objects. For example, light reflection from the Munsell book chips was probably less effective than from the color plate squares, so the chips appeared duller under ordinary lighting than under the intense light of the FieldSpec 4.
As with the color plate squares, calibration using the color plate did not enhance the correlations between smartphone-derived and FieldSpec 4-measured color parameters for a given smartphone under a given lighting condition, and the R2 stayed at the same level after the calibration (e.g., the R2 range remained 0.89 to 0.95 for Munsell V with the iPhone 14 under different lighting conditions; Figure 5A and Figure S1). A major difference from the color plate squares was that the calibration did not reduce the error (in absolute value) for the Munsell book chips; in fact, it increased the error for Munsell V with the iPhone 14 (Figure 5(Aa,Ab)) and for about half of the other color parameters under other conditions as well (Table 3 and Table 4). This indicates that the calibration was ineffective in removing the systematic bias of the data. However, the calibration did reduce the differences between lighting conditions and between smartphones. As a result, when data from different lighting conditions or different smartphones were pooled together, the data points clustered closer to each other after the calibration (e.g., Figure 5(Ac,Ad)). This was reflected in the SDs of the errors, as the mean SDs of the errors for the calibrated values were much lower than for the uncalibrated ones, with only a few exceptions (Table 3 and Table 4). The correlations between the smartphone-derived and the FieldSpec 4-measured values were also enhanced after the calibration (e.g., for Munsell V, R2 changed from 0.80, 0.78, and 0.80 to 0.90, 0.82, and 0.85 for the pooled data for the iPhone 14, for Inside-Dim, and for all settings combined, respectively; Figure 5A). Overall, the results indicate that although the calibration was ineffective in removing biases, it did enhance precision and, consequently, the ability of smartphones to distinguish the different colors of the Munsell book chips.

3.4. Soil Samples

For the soil samples, the patterns were very similar to those of Munsell book chips (Figure 5B and Figure S1). Under individual settings (smartphone and lighting), the smartphone-derived and FieldSpec 4-measured values correlated very well with each other but there were large differences between smartphones and between lighting conditions. This indicates lower precision with the soil samples than with the color plate squares which, similarly to the Munsell book chips, could be due to the material itself or the larger errors of the FieldSpec 4 measurements. Again, there was a consistent underestimation of all color parameters, and the means of errors were mostly negative values with very few exceptions (Table 3 and Table 4). This bias was attributed to the material differences between the soil samples and the color plate squares.
The effects of calibration were also similar to those for the Munsell book chips. The calibration did not reduce the negative bias but brought the data points of different settings closer together, resulting in the same level of error (in absolute value) but lower SDs of the errors than the uncalibrated data (Figure 5(Ba,Bb); Table 3 and Table 4). The correlations between the smartphone-derived and the FieldSpec 4-measured values were also enhanced drastically after the calibration (e.g., for Munsell V, R2 changed from 0.54, 0.59, and 0.55 to 0.87, 0.95, and 0.83 for the pooled data for the iPhone 14, for Inside-Dim, and for all settings combined, respectively; Figure 5B). This again confirms that although the calibration was unable to reduce the bias, it was effective in improving the precision of the smartphone-derived color parameters.

4. Discussion

4.1. The Color Reference

The color plate was used as the color reference in this study. For the color plate squares, the calibration was assessed through cross-validation, and the results showed that it could enhance both the precision and accuracy of the smartphone-derived color parameters when the pictures were taken with different phones or under different lighting conditions. For the other two types of objects, the Munsell book chips and the soil samples, the calibration was not able to reduce the biases but did enhance the precision of the smartphone-derived color parameters. The biases can be explained by differences in the materials of the objects. The color plate was professionally manufactured as a color reference, so its color uniformity is very high, and it has a smooth but not glossy surface, which scatters light in all directions. The Munsell book is designed for visual color comparison; it has a slightly glossy surface with random patterns of unevenness, so depending on the direction of the light source and the camera angle, there could be patches on the chips with high and low levels of reflection (Figure 1). The soil samples had a less smooth surface, and the soil particles could cast shadows depending on the direction of the light source. Soil is also a very different material, which reflects light very differently from the color plate. Nevertheless, in most real-world applications, the relative differences in color are what really matter (e.g., using soil color to estimate soil organic carbon content). For these applications, it is not necessary to obtain the true values of the color parameters themselves; what is required is high precision in the relative differences between measured objects. In such cases, the color plate used in this study will work well, as it enhances the precision of the results and thus improves the chance of detecting any real differences.
The calibration in this study was based on linear regressions between the smartphone-derived and the FieldSpec 4-measured values. However, these relationships were not always linear; in fact, in most cases they were non-linear (e.g., Figure 3(Aa,Ba)). Tests were conducted with higher-degree polynomial equations to fit the data points. The R2 values were generally higher with these polynomial equations, but extreme values carried heavier weight in determining the fitted equations. As a result, the errors for data points within the normal range were often higher than with the linear regression. Therefore, linear regression was chosen in this study.
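The comparison described above could be set up as in the following sketch (hypothetical data objects); it only shows how linear and polynomial fits would be contrasted and does not reproduce the study's numbers.

## plate_phone / plate_ref: raw and FieldSpec 4-measured values for the 24 squares (hypothetical)
fit_lin  <- lm(plate_ref ~ plate_phone)
fit_poly <- lm(plate_ref ~ poly(plate_phone, 3))    # a third-degree polynomial, as one example

summary(fit_lin)$r.squared     # goodness of fit of the linear model
summary(fit_poly)$r.squared    # often higher, but the fit can be pulled towards extreme squares

## Compare absolute residuals for the mid-range squares only, where most soil colors fall
mid <- plate_phone > quantile(plate_phone, 0.1) & plate_phone < quantile(plate_phone, 0.9)
mean(abs(resid(fit_lin)[mid]))
mean(abs(resid(fit_poly)[mid]))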
The data generated in this study also provide some insight into how including a color reference in the picture may or may not work for color measurement correction. First, the distortion in color measurements due to smartphone type and lighting conditions was complicated. It was not a simple shift, as evidenced by the fact that the regression lines between the raw smartphone-derived and the FieldSpec 4-measured color parameters mostly had slopes different from that of the 1:1 line. This means that a single color reference will not work, because a single calibration point can only provide a shift. References with multiple colors are needed to account for the different distortions in different parts of the color space, which agrees with the conclusions drawn by several researchers [23,28]. Moreover, for each smartphone and each lighting condition, the distortions of each color parameter were unique (e.g., Figure 3(Aa,Ba)). Therefore, the distribution of the reference colors in the color space is important: they need to cover the ranges of the three stimuli that define the color space. The different shades of gray, ranging from black to white, used by Aitkenhead et al. [24] and the few bright colors used by Levin et al. [23] would likely not work as well, because the reference points were sparse and irregularly distributed in the color space, although they may still be better than single-point color correction. The color plate used in this study was designed for color correction and its colors are evenly distributed across the range of each color parameter; it therefore provided the most robust correction for the color space as a whole. Lastly, the results also showed that ideal performance can only be achieved when the target objects are the same as the color reference (the color plate squares in this study). This highlights the importance of keeping the material and surface conditions of the color reference and the objects of interest as similar as possible, as suggested by several researchers [23,27,28]. A glossy surface, such as that of the Munsell book chips or a laminated color card, should be avoided.
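To illustrate the point about single-point calibration, the sketch below contrasts a shift-only correction from one gray reference with the slope-and-intercept correction from the 24 plate squares; all object names are hypothetical.

## One gray reference: only an offset can be estimated, so the correction is a pure shift
offset       <- gray_ref - gray_phone                 # hypothetical scalar values
soil_shifted <- soil_raw + offset

## Twenty-four plate squares: slope and intercept are both estimated, so distortions that
## stretch or compress the scale are corrected as well
fit            <- lm(plate_ref ~ plate_phone)
soil_corrected <- predict(fit, newdata = data.frame(plate_phone = soil_raw))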

4.2. Choosing Smartphones and Lighting Conditions

Among the four smartphones tested, based on the errors before calibration, the Huawei Mate 10 appeared to perform the best and the iPhone 14 did not perform as well as the other phones (Table 3). Such differences could be due to the auto-adjustments introduced in newer smartphones under the default mode. Nevertheless, after calibration, these differences disappeared, and the errors and SDs of errors for the iPhone 14 were mostly smaller than, or at the same level as, those of the other phones. Among the six lighting conditions, without calibration, the Inside-Normal, Overcast-AM, and Overcast-PM conditions appeared to yield better results than the others (Table 4). After calibration, the differences between lighting conditions were reduced significantly, but the overall pattern still existed. This means that with the calibration, the choice of smartphone did not have a noticeable effect on the results, but the choice of lighting condition did, with Inside-Normal, Overcast-AM, and Overcast-PM being the better-performing conditions. The poor performance of the Inside-Dim lighting condition indicates that sufficient lighting is required for good results. The poor performance of the Sunny-AM and Sunny-PM conditions was likely due to the contrast between bright colors under direct sunshine and the dark shadows caused by surface roughness. Such effects could be reduced by blocking direct sunlight, for example with an umbrella.

4.3. Applications of the Calibration Method

Calibration has been shown to reduce the differences between smartphones or between lighting conditions, but it cannot eliminate the errors created by these differences. Analyses of the 30 photos (one per soil sample) taken under the same lighting condition with the same smartphone showed high R2 values (typically greater than 0.95) between the smartphone-derived and FieldSpec 4-measured values for both the color plate squares and the Munsell book chips. The errors and SDs of errors were also much lower than those obtained with varying lighting conditions or smartphones. This was not a surprise, as many studies have suggested that the same lighting conditions and photography device should be used for color detection through image analysis [15,23,25,30]. Therefore, to apply this method to measure soil color in a lab, it is recommended to use the same smartphone and to keep the lighting condition as constant as possible.
This study also showed the different responses of different types of objects, relating to the material differences, surface conditions, and the uniformity of the colors of the objects. Therefore, for soil color measurement, the soil samples need to be prepared in the same way. This includes the drying, grinding, and sieving processes to keep the moisture level and particle sizes of the samples the same. It also requires the surface of the sample to be smoothed in the same way so that the surface roughness of the sample is approximately the same for all samples. To further reduce the systematic biases of the smartphone-derived color parameters, for soil sample measurement, instead of the color plate used in this study, a color reference made from clay or dyed soil may be better [28]. However, these color references need to have colors that cover the full range of the different color parameters of the target objects, and each color needs to have a set of “true” values for all color parameters, which can be measured with a reference machine like FieldSpec 4.
A more valuable application of the method is for field use. The ability to correct different smartphone and lighting conditions enables ordinary citizens to participate in the data collection of soil colors at any location. To reduce errors, it is recommended to use the same smartphone under similar lighting conditions as much as possible. Concerning lighting conditions, measurements can be taken on days of similar weather (e.g., overcast) at about the same time during the day (e.g., always between 9 a.m. and 11 a.m.). If measurements have to be taken on a sunny day, direct sunlight should be avoided by blocking it using an umbrella or other tools. Moreover, the soil surface needs to be prepared so that it resembles that of the color reference.
Although the calibration method was developed to measure soil color, it should be applicable to the color measurement of other objects since the principle would be the same. For example, it may be applicable to measuring the color of water samples or plant tissue samples. Obviously, more studies are required to test the validity of this method for use in field experiments and for other objects.

5. Conclusions

Soil color is an indicator of many soil properties, and image analysis allows ordinary citizens to detect soil color by taking pictures of the soil with their smartphones. However, the color detected with a smartphone is strongly affected by the smartphone used and by the lighting conditions under which the picture is taken. The objective of this study was to develop a simple and inexpensive method that ordinary citizens can use to reduce the errors associated with the smartphone and the lighting conditions in the color detection process. The core of the method is to place a color plate beside the target object when taking the picture and to use the color plate as a reference to calibrate the color parameters derived from the smartphone images. Three types of objects were tested: the color plate squares themselves, Munsell book chips, and soil samples. The results show that, for the color plate squares, the calibration reduced the errors consistently for all smartphones under all lighting conditions, whereas for the Munsell book chips and soil samples, the error reductions were less consistent. However, the calibration did reduce the variations in all color parameters across smartphones and lighting conditions for all three types of objects, thus improving the precision of the color detection process.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/soilsystems9030093/s1. Figure S1: Uncalibrated (a) and calibrated (b) smartphone-derived values compared to FieldSpec 4-measured values and their associated errors (c,d) showing the effects of the calibration method on correcting both lighting conditions and smartphones for different color parameters and different types of objects.

Author Contributions

Conceptualization, S.L.; methodology, S.L., F.Z., Y.K., A.J.K., D.A.L., and M.G.; software, F.Z. and A.J.K.; validation, S.L. and F.Z.; formal analysis, F.Z. and S.L.; investigation, F.Z., Y.K., S.L., A.J.K., M.G., and D.A.L.; resources, S.L. and D.A.L.; data curation, Y.K. and F.Z.; writing—original draft preparation, S.L. and F.Z.; writing—review and editing, S.L., F.Z., Y.K., A.J.K., D.A.L., and M.G.; visualization, F.Z., S.L., and Y.K.; supervision, S.L.; project administration, S.L.; funding acquisition, S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by an Enabling Agricultural Research and Innovation (EARI) project, “Developing a science based grass-roots level tool for systematic field evaluation of soil health in New Brunswick” (J-003242, PI: Li), managed by the province of New Brunswick via the Canadian Agricultural Partnership, as well as the Agriculture and Agri-food Canada project “Sustainability measures to monitor and analyze the environmental impact of Canadian agriculture” [project number J-002316] (PI: McDonald).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets presented in this article are not readily available because the data are part of an ongoing study. Requests to access the datasets should be directed to sheng.li@agr.gc.ca.

Acknowledgments

The authors would like to acknowledge Cory Barstow, Megan Bent, Sarah Etheridge, and Rebekah Kierstead for helping with the data collection and sample processing.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Bigham, J.; Ciolkosz, E.; Luxmoore, R. Soil Color: Proceedings of a Symposium Sponsored by Divisions S-5 and S-9 of the Soil Science Society of America in San Antonio, Texas, 21–26 October 1990; Soil Science Society of America: Madison, WI, USA, 1993. [Google Scholar]
  2. Naeimi, M.; Daggupati, P.; Biswas, A. Image-based soil characterization: A review on smartphone applications. Comput. Electron. Agric. 2024, 227, 109502. [Google Scholar] [CrossRef]
  3. Persson, M. Estimating surface soil moisture from soil color using image analysis. Vadose Zone J. 2005, 4, 1119–1122. [Google Scholar] [CrossRef]
  4. Wills, S.A.; Burras, C.L.; Sandor, J.A. Prediction of soil organic carbon content using field and laboratory measurements of soil color. Soil Sci. Soc. Am. J. 2007, 71, 380–388. [Google Scholar] [CrossRef]
  5. Liles, G.C.; Beaudette, D.E.; O’Geen, A.T.; Horwath, W.R. Developing predictive soil C models for soils using quantitative color measurements. Soil Sci. Soc. Am. J. 2013, 77, 2173–2181. [Google Scholar] [CrossRef]
  6. Fu, Y.; Taneja, P.; Lin, S.; Ji, W.; Adamchuk, V.; Daggupati, P.; Biswas, A. Predicting soil organic matter from cellular phone images under varying soil moisture. Geoderma 2020, 361, 114020. [Google Scholar] [CrossRef]
  7. Viscarra Rossel, R.A.; Cattle, S.R.; Ortega, A.; Fouad, Y. In situ measurements of soil colour, mineral composition and clay content by vis–NIR spectroscopy. Geoderma 2009, 150, 253–266. [Google Scholar] [CrossRef]
  8. Ibáñez-Asensio, S.; Marques-Mateu, A.; Moreno-Ramón, H.; Balasch, S. Statistical relationships between soil colour and soil attributes in semiarid areas. Biosyst. Eng. 2013, 116, 120–129. [Google Scholar] [CrossRef]
  9. Moritsuka, N.; Matsuoka, K.; Katsura, K.; Sano, S.; Yanai, J. Soil color analysis for statistically estimating total carbon, total nitrogen and active iron contents in Japanese agricultural soils. Soil Sci. Plant Nutr. 2014, 60, 475–485. [Google Scholar] [CrossRef]
  10. Han, P.; Dong, D.; Zhao, X.; Jiao, L.; Lang, Y. A smartphone-based soil color sensor: For soil type classification. Comput. Electron. Agric. 2016, 123, 232–241. [Google Scholar] [CrossRef]
  11. Young, T., II. The Bakerian Lecture. On the theory of light and colours. Philos. Trans. R. Soc. Lond. 1802, 92, 12–48. [Google Scholar] [CrossRef]
  12. Buchsbaum, G.; Gottschalk, A. Trichromacy, opponent colours coding and optimum colour information transmission in the retina. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1983, 220, 89–113. [Google Scholar]
  13. Tkalcic, M.; Tasic, J.F. Colour spaces: Perceptual, historical and applicational background. In Proceedings of the IEEE Region 8 EUROCON 2003 Computer as a Tool, Ljubljana, Slovenia, 22–24 September 2003; Volume 301, pp. 304–308. [Google Scholar]
  14. Pendleton, R.L.; Nickerson, D. Soil colors and special Munsell soil color charts. Soil Sci. 1951, 71, 35–44. [Google Scholar] [CrossRef]
  15. Torrent, J.; Barrón, V. Laboratory measurement of soil color: Theory and practice. Soil Color 1993, 31, 21–33. [Google Scholar]
  16. Marcus, R.T. Chapter 2—The Measurement of Color. In AZimuth; Nassau, K., Ed.; North-Holland: Dutch, The Netherlands, 1998; Volume 1, pp. 31–96. [Google Scholar]
  17. Barthod, L.R.; Liu, K.; Lobb, D.A.; Owens, P.N.; Martínez-Carreras, N.; Koiter, A.J.; Petticrew, E.L.; McCullough, G.K.; Liu, C.; Gaspar, L. Selecting color-based tracers and classifying sediment sources in the assessment of sediment dynamics using sediment source fingerprinting. J. Environ. Qual. 2015, 44, 1605–1616. [Google Scholar] [CrossRef]
  18. Stiglitz, R.; Mikhailova, E.; Post, C.; Schlautman, M.; Sharp, J. Evaluation of an inexpensive sensor to measure soil color. Comput. Electron. Agric. 2016, 121, 141–148. [Google Scholar] [CrossRef]
  19. Das, B.; Chakraborty, D.; Singh, V.K.; Das, D.; Sahoo, R.N.; Aggarwal, P.; Murgaokar, D.; Mondal, B.P. Partial least square regression based machine learning models for soil organic carbon prediction using visible–near infrared spectroscopy. Geoderma Reg. 2023, 33, e00628. [Google Scholar] [CrossRef]
  20. Mouazen, A.; Karoui, R.; Deckers, J.; De Baerdemaeker, J.; Ramon, H. Potential of visible and near-infrared spectroscopy to derive colour groups utilising the Munsell soil colour charts. Biosyst. Eng. 2007, 97, 131–143. [Google Scholar] [CrossRef]
  21. Parry, J.; Cowan, W.; Heginbottom, J. Soils Studies Using Color Photos. Photogramm. Eng. 1969, 35, 44–56. [Google Scholar]
  22. Chen, F.; Kissel, D.E.; West, L.T.; Adkins, W. Field-scale mapping of surface soil organic carbon using remotely sensed imagery. Soil Sci. Soc. Am. J. 2000, 64, 746–753. [Google Scholar] [CrossRef]
  23. Levin, N.; Ben-Dor, E.; Singer, A. A digital camera as a tool to measure colour indices and related properties of sandy soils in semi-arid environments. Int. J. Remote Sens. 2005, 26, 5475–5492. [Google Scholar] [CrossRef]
  24. Aitkenhead, M.; Coull, M.; Gwatkin, R.; Donnelly, D. Automated Soil Physical Parameter Assessment Using Smartphone and Digital Camera Imagery. J. Imaging 2016, 2, 35. [Google Scholar] [CrossRef]
  25. Aitkenhead, M.; Cameron, C.; Gaskin, G.; Choisy, B.; Coull, M.; Black, H. Digital RGB photography and visible-range spectroscopy for soil composition analysis. Geoderma 2018, 313, 265–275. [Google Scholar] [CrossRef]
  26. Gómez-Robledo, L.; López-Ruiz, N.; Melgosa, M.; Palma, A.J.; Capitán-Vallvey, L.F.; Sánchez-Marañón, M. Using the mobile phone as Munsell soil-colour sensor: An experiment under controlled illumination conditions. Comput. Electron. Agric. 2013, 99, 200–208. [Google Scholar] [CrossRef]
  27. Taneja, P.; Vasava, H.K.; Daggupati, P.; Biswas, A. Multi-algorithm comparison to predict soil organic matter and soil moisture content from cell phone images. Geoderma 2021, 385, 114863. [Google Scholar] [CrossRef]
  28. Kirillova, N.; Zhang, Y.; Hartemink, A.; Zhulidova, D.; Artemyeva, Z.; Khomyakov, D. Calibration methods for measuring the color of moist soils with digital cameras. Catena 2021, 202, 105274. [Google Scholar] [CrossRef]
  29. Yang, J.; Shen, F.; Wang, T.; Luo, M.; Li, N.; Que, S. Effect of smart phone cameras on color-based prediction of soil organic matter content. Geoderma 2021, 402, 115365. [Google Scholar] [CrossRef]
  30. Liu, G.; Tian, S.; Xu, G.; Zhang, C.; Cai, M. Combination of effective color information and machine learning for rapid prediction of soil water content. J. Rock Mech. Geotech. Eng. 2023, 15, 2441–2457. [Google Scholar] [CrossRef]
  31. HunterLab. Spectrophotometer vs. Colorimeter: What’s the Difference? Available online: https://www.hunterlab.com/blog/spectrophotometer-vs-colorimeter-whats-the-difference/ (accessed on 6 June 2025).
  32. Baek, S.-H.; Park, K.-H.; Jeon, J.-S.; Kwak, T.-Y. A Novel Method for Calibration of Digital Soil Images Captured under Irregular Lighting Conditions. Sensors 2023, 23, 296. [Google Scholar] [CrossRef]
  33. Baek, S.-H.; Jeon, J.-S.; Kwak, T.-Y. Color calibration of moist soil images captured under irregular lighting conditions. Comput. Electron. Agric. 2023, 214, 108299. [Google Scholar] [CrossRef]
  34. Datacolor. Spyder Checkr User Guide. Available online: https://www.datacolor.com/spyder/wp-content/uploads/2023/07/Spyder-Checkrs-User-Guide-FINAL-EN.pdf (accessed on 19 August 2025).
  35. Gama, J.; Centore, P.; Davis, G. Munsellinterpol: Interpolate Munsell Renotation Data from Hue/Chroma to CIE/RGB. R Package Version 2.6.1. 2020. Available online: https://cran.r-project.org/web/packages/munsellinterpol/index.html (accessed on 19 August 2025).
  36. Ooms, J. Magick: Advanced Graphics and Image-Processing in R, Version 2.7.4. 2023. Available online: http://CRAN.R-project.org/package=magick (accessed on 19 August 2025).
  37. Wickham, H.; Averick, M.; Bryan, J.; Chang, W.; McGowan, L.D.A.; François, R.; Grolemund, G.; Hayes, A.; Henry, L.; Hester, J. Welcome to the Tidyverse. J. Open Source Softw. 2019, 4, 1686. [Google Scholar] [CrossRef]
Figure 1. Layout of the objects in the photos. The color plate card (with 24 squares of different colors) was placed in the middle. The soil sample was in a Petri dish placed directly below the color plate card, and the seven Munsell book sheets were placed around the color plate card and the soil sample.
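As an illustration of how per-object colors can be extracted from a photo laid out as in Figure 1, the sketch below averages the pixels of a single plate square using the magick R package cited above [36]. The file name and crop geometry are hypothetical placeholders; the actual coordinates of each square, chip, and Petri dish would need to be located in each photo.

```r
library(magick)

# Read one photo laid out as in Figure 1 (file name is a placeholder)
img <- image_read("photo_inside_dim_iphone14.jpg")

# Crop one color-plate square; "widthxheight+x+y" is magick geometry syntax,
# and these coordinates are illustrative only
square <- image_crop(img, "60x60+850+620")

# Raw bitmap with dimensions channels x width x height
px  <- image_data(square, channels = "rgb")
arr <- array(as.integer(px), dim = dim(px))

# Mean R, G, and B of the cropped region on a 0-255 scale
rgb_mean <- apply(arr, 1, mean)
rgb_mean
```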
Figure 2. Measurements derived from FieldSpec 4 compared to values provided in the Munsell soil color book.
Figure 3. Uncalibrated (a,c,e) and calibrated (b,d,f) smartphone-derived values (y-axis) compared to FieldSpec 4-measured values (x-axis) of Munsell V (a–d), and the associated errors of the smartphone-derived values (e,f), as examples of the effects of the calibration method on correcting color parameters of the color plate squares under (A) different lighting conditions (all photos taken with the iPhone 14) and (B) with different smartphones (all photos taken under the Inside-Dim lighting condition).
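Figures 3–5 compare smartphone-derived values before and after calibration against the color plate. Purely as a hedged sketch of the general idea, and not necessarily the authors' implementation, a reference-plate correction of this kind can be written as a per-channel linear mapping fitted on the 24 plate squares and then applied to the other objects in the same photo. The data frames plate and soil and all column names below are assumptions for illustration.

```r
# plate: observed (obs_*) and reference (ref_*) channel values of the 24 squares
# soil:  observed channel values of the other objects in the same photo
# All object and column names are assumed for illustration.
calibrate_channel <- function(obs_plate, ref_plate, obs_target) {
  # Fit a linear mapping from observed to reference values on the plate squares
  fit <- lm(ref ~ obs, data = data.frame(obs = obs_plate, ref = ref_plate))
  # Apply the fitted mapping to the target observations
  predict(fit, newdata = data.frame(obs = obs_target))
}

soil$cal_R <- calibrate_channel(plate$obs_R, plate$ref_R, soil$obs_R)
soil$cal_G <- calibrate_channel(plate$obs_G, plate$ref_G, soil$obs_G)
soil$cal_B <- calibrate_channel(plate$obs_B, plate$ref_B, soil$obs_B)
```

The corrected RGB values could then be converted to Munsell H, V, and C (for example, with the munsellinterpol package [35]) for comparison against the FieldSpec 4 reference values.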
Figure 4. Uncalibrated (a) and calibrated (b) smartphone-derived values (y-axis) compared to FieldSpec 4-measured values (x-axis) of Munsell V, and the associated errors of the smartphone-derived values (c,d), as examples of the effects of the calibration method on correcting color parameters of the color plate squares with lighting conditions and smartphones combined.
Figure 5. Uncalibrated (a,c,e) and calibrated (b,d,f) smartphone-derived values compared to FieldSpec 4-measured values of the Munsell value (V) as examples of the effects of the calibration method on correcting color parameters for lighting conditions, for smartphones, and for both combined, for (A) Munsell book chips and (B) soil samples.
Table 1. Specifications for the main camera of the four smartphones used in this study.
Smartphone | Pixels | Aperture | Focal Length | Sensor Size | Pixel Size | Autofocus | Stabilization
Huawei Mate 10 | 12 MP | f/1.6 | 27 mm | 1/2.9″ | 1.25 µm | PDAF | OIS
iPhone 14 | 12 MP | f/1.5 | 26 mm | 1/1.7″ | 1.9 µm | dual pixel PDAF | sensor-shift OIS
Samsung S23 | 50 MP | f/1.8 | 24 mm | 1/1.56″ | 1.0 µm | dual pixel PDAF | OIS
Samsung S23 Ultra | 200 MP | f/1.7 | 24 mm | 1/1.3″ | 0.6 µm | multi-directional PDAF | OIS
Table 2. Means and 90th percentiles, across the n objects of each object type, of the statistics of the three repetitions (Avg = average/mean; SD = standard deviation; CV = coefficient of variation).
 | Color Plate Squares (n = 24) | Munsell Book Chips (n = 238) | Soil Samples (n = 30)
 | Avg | SD | CV (%) | Avg | SD | CV (%) | Avg | SD | CV (%)
Mean of the n objects
R (Red) | 125.0 | 0.14 | 0.16 | 157.6 | 0.67 | 0.43 | 128.0 | 0.39 | 0.35
G (Green) | 116.2 | 0.08 | 0.08 | 128.7 | 0.66 | 0.56 | 103.3 | 0.32 | 0.34
B (Blue) | 103.6 | 0.09 | 0.14 | 105.2 | 0.64 | 0.66 | 80.2 | 0.26 | 0.35
H (Hue) | 52.4 | 0.02 | 0.03 | 66.1 | 0.05 | 0.08 | 68.4 | 0.02 | 0.03
V (Value) | 5.0 | 0.00 | 0.06 | 5.5 | 0.03 | 0.49 | 4.5 | 0.01 | 0.33
C (Chroma) | 6.1 | 0.01 | 0.27 | 3.2 | 0.02 | 0.66 | 2.9 | 0.01 | 0.45
90th percentile of the n objects
R (Red) | 216.4 | 0.33 | 0.28 | 220.3 | 1.52 | 1.00 | 158.1 | 0.66 | 0.71
G (Green) | 182.6 | 0.19 | 0.19 | 191.2 | 1.38 | 1.10 | 134.1 | 0.52 | 0.68
B (Blue) | 156.5 | 0.20 | 0.19 | 163.3 | 1.36 | 1.25 | 102.0 | 0.43 | 0.66
H (Hue) | 84.7 | 0.03 | 0.06 | 73.2 | 0.08 | 0.13 | 70.2 | 0.04 | 0.05
V (Value) | 7.2 | 0.01 | 0.14 | 7.9 | 0.05 | 0.98 | 5.6 | 0.02 | 0.67
C (Chroma) | 11.5 | 0.02 | 0.51 | 6.5 | 0.04 | 1.54 | 4.0 | 0.02 | 0.70
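As a worked illustration of how the entries in Table 2 can be obtained, the sketch below computes, for one color parameter, the per-object mean, standard deviation, and coefficient of variation of the three repetitions and then summarizes each statistic across objects by its mean and 90th percentile. The matrix reps is an assumed placeholder (one row per object, one column per repetition).

```r
# reps: numeric matrix of one color parameter (e.g., R), n objects x 3 repetitions
avg <- rowMeans(reps)          # per-object mean of the three repetitions
sdv <- apply(reps, 1, sd)      # per-object standard deviation
cv  <- 100 * sdv / avg         # per-object coefficient of variation (%)

# Table 2 reports the mean and the 90th percentile of each statistic over the n objects
summary_row <- function(x) c(mean = mean(x), p90 = unname(quantile(x, 0.90)))
rbind(Avg = summary_row(avg), SD = summary_row(sdv), CV = summary_row(cv))
```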
Table 3. Means of the mean errors and standard deviations (SDs) of errors over the six lighting conditions, averaged among the number of objects (n) of each object type, for each smartphone, for the uncalibrated and calibrated data of the six color parameters (R = Red; G = Green; B = Blue; H = Hue; V = Value; and C = Chroma). Bold numbers indicate that the absolute values of the calibrated data are greater than those of the uncalibrated data.
 | Uncalibrated (R | G | B | H | V | C) | Calibrated (R | G | B | H | V | C)
Mean (of n objects) of Mean Errors (of six lighting conditions)
Color plate squares (n = 24)
Huawei Mate 10 | −5.86 | −9.00 | −0.50 | 0.06 | −0.27 | 0.70 | 0.06 | −0.15 | 0.05 | 0.05 | −0.01 | 0.02
iPhone 14 | −13.63 | −13.06 | −8.36 | 0.33 | −0.54 | 0.14 | 0.14 | −0.03 | 0.16 | 0.01 | 0.00 | 0.01
Samsung S23 | 4.51 | 4.85 | 15.93 | 0.59 | 0.22 | 1.12 | −0.05 | −0.27 | −0.11 | 0.11 | −0.01 | 0.02
S23 Ultra | 6.24 | 6.22 | 19.93 | 0.15 | 0.27 | 0.90 | −0.06 | −0.27 | −0.12 | 0.05 | −0.01 | 0.02
Munsell book chips (n = 219)
Huawei Mate 10 | −30.12 | −22.91 | −17.94 | −3.37 | −0.97 | 0.22 | −24.69 | −14.63 | −17.23 | −3.00 | −0.70 | 0.41
iPhone 14 | −39.64 | −29.54 | −27.93 | 0.62 | −1.28 | −0.29 | −25.68 | −16.51 | −19.20 | 0.86 | −0.73 | −0.16
Samsung S23 | 30.09 | 19.22 | 17.28 | 1.47 | 0.87 | 0.22 | 34.86 | 23.98 | 31.59 | 1.62 | 1.08 | 0.40
S23 Ultra | 27.38 | 16.08 | 11.48 | −2.08 | 0.75 | 0.32 | 33.69 | 22.53 | 31.81 | −1.78 | 1.03 | 0.33
Soil samples (n = 30)
Huawei Mate 10 | −25.56 | −22.72 | 15.16 | −3.63 | −0.93 | 0.43 | −20.64 | −11.27 | 15.53 | −3.44 | −0.55 | 0.54
iPhone 14 | −33.71 | −24.31 | −19.18 | 0.28 | −1.08 | −0.67 | −21.04 | −8.53 | −11.74 | −0.21 | −0.40 | −0.46
Samsung S23 | 11.11 | 4.12 | 0.63 | 2.86 | 0.23 | −0.45 | 16.37 | 7.74 | 14.98 | 2.97 | 0.40 | −0.39
S23 Ultra | 6.41 | 0.39 | 7.47 | −3.99 | 0.04 | −0.62 | 14.33 | 5.67 | 11.89 | −3.73 | 0.32 | −0.49
Mean (of n objects) of SD of Errors (of six lighting conditions)
Color plate squares (n = 24)
Huawei Mate 10 | 21.69 | 11.69 | 13.66 | 2.66 | 0.52 | 1.59 | 7.03 | 4.88 | 7.04 | 2.91 | 0.20 | 1.34
iPhone 14 | 25.39 | 22.80 | 28.67 | 2.99 | 0.92 | 1.75 | 5.40 | 4.45 | 6.13 | 3.27 | 0.17 | 1.43
Samsung S23 | 16.94 | 13.26 | 22.28 | 3.00 | 0.51 | 1.81 | 6.30 | 4.27 | 7.49 | 3.31 | 0.18 | 1.42
S23 Ultra | 16.89 | 15.51 | 26.08 | 2.98 | 0.59 | 1.88 | 7.03 | 4.72 | 7.11 | 3.23 | 0.19 | 1.42
Munsell book chips (n = 219)
Huawei Mate 10 | 23.96 | 15.60 | 13.34 | 11.47 | 0.67 | 1.10 | 13.42 | 10.86 | 13.86 | 10.11 | 0.45 | 1.04
iPhone 14 | 24.88 | 19.89 | 19.81 | 11.66 | 0.83 | 0.97 | 10.45 | 9.39 | 12.59 | 10.13 | 0.36 | 0.88
Samsung S23 | 21.57 | 17.54 | 19.94 | 12.38 | 0.71 | 1.28 | 13.75 | 12.09 | 14.57 | 10.67 | 0.50 | 1.10
S23 Ultra | 21.13 | 18.41 | 23.52 | 12.42 | 0.74 | 1.28 | 14.42 | 12.34 | 16.36 | 10.50 | 0.50 | 1.16
Soil samples (n = 30)
Huawei Mate 10 | 18.82 | 13.52 | 12.63 | 7.46 | 0.59 | 1.01 | 13.49 | 7.98 | 9.53 | 6.68 | 0.32 | 1.00
iPhone 14 | 24.73 | 20.40 | 19.93 | 9.85 | 0.89 | 0.85 | 10.99 | 8.11 | 11.78 | 8.86 | 0.29 | 0.66
Samsung S23 | 20.84 | 17.43 | 18.51 | 10.87 | 0.71 | 1.10 | 13.14 | 10.65 | 10.98 | 9.23 | 0.42 | 0.89
S23 Ultra | 21.06 | 18.52 | 19.72 | 12.38 | 0.74 | 0.94 | 13.20 | 10.26 | 12.12 | 10.50 | 0.41 | 0.80
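To make the grouping behind Tables 3 and 4 concrete, the sketch below shows how the two summary statistics could be computed from per-photo errors (smartphone-derived minus FieldSpec 4-derived values) using dplyr from the tidyverse [37]. The data frame err and its column names are assumed placeholders.

```r
library(dplyr)

# err: one row per object x lighting condition x smartphone for one color
# parameter of one object type; the columns object, lighting, phone, and error
# are assumed names for illustration.

# Table 3: per object and smartphone, the mean and SD of the errors over the
# six lighting conditions, then the mean of those statistics over the n objects
table3 <- err |>
  group_by(phone, object) |>
  summarise(mean_err = mean(error), sd_err = sd(error), .groups = "drop") |>
  group_by(phone) |>
  summarise(mean_of_mean_err = mean(mean_err), mean_of_sd_err = mean(sd_err))

# Table 4 uses the same two statistics but swaps the grouping: errors are first
# summarised over the four smartphones for each object and lighting condition,
# then averaged over the n objects (replace phone with lighting above).
```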
Table 4. Means of the mean errors and standard deviations (SDs) of errors over the four smartphones, averaged among the number of objects (n) of each object type, for each lighting condition, for the uncalibrated and calibrated data of the six color parameters (R = Red; G = Green; B = Blue; H = Hue; V = Value; and C = Chroma). Bold numbers indicate that the absolute values of the calibrated data are greater than those of the uncalibrated data.
 | Uncalibrated (R | G | B | H | V | C) | Calibrated (R | G | B | H | V | C)
Mean (of n objects) of Mean Errors (of four smartphones)
Color plate squares (n = 24)
Inside-Dim | −18.02 | −21.02 | −13.98 | 0.63 | −0.83 | −0.41 | 0.13 | −0.07 | 0.12 | 0.02 | −0.01 | 0.01
Inside-Normal | 8.06 | −2.58 | −5.60 | 1.63 | −0.05 | 0.03 | 0.01 | −0.13 | 0.06 | 0.02 | −0.01 | 0.02
Overcast-AM | −9.73 | −11.87 | −6.54 | 0.73 | −0.40 | 0.96 | −0.03 | −0.20 | −0.04 | 0.01 | −0.01 | 0.01
Overcast-PM | 5.56 | 2.50 | 8.15 | 0.65 | 0.17 | 1.11 | −0.01 | −0.21 | −0.07 | 0.02 | −0.01 | 0.02
Sunny-AM | 22.91 | 17.16 | 20.32 | 1.06 | 0.72 | 0.51 | −0.05 | −0.29 | −0.08 | 0.04 | −0.01 | 0.03
Sunny-PM | −21.91 | −0.66 | 38.14 | −3.01 | −0.09 | 2.08 | 0.08 | −0.18 | −0.02 | 0.24 | −0.01 | 0.03
Munsell book chips (n = 219)
Inside-Dim | −54.44 | −42.74 | −36.67 | 2.19 | −1.82 | −0.92 | −35.81 | −21.74 | −24.68 | 1.78 | −1.00 | −0.59
Inside-Normal | 24.64 | −23.76 | −33.90 | 3.21 | −0.98 | 0.60 | 33.22 | −21.43 | −28.58 | 2.41 | −0.93 | 0.58
Overcast-AM | −24.83 | −16.39 | −14.98 | 1.23 | −0.74 | 0.45 | −16.64 | −7.12 | −7.83 | 0.61 | −0.40 | 0.82
Overcast-PM | 25.44 | 16.94 | 16.33 | 2.56 | 0.77 | 0.43 | 31.60 | 18.99 | 21.93 | 2.04 | 0.89 | 0.77
Sunny-AM | 7.24 | 4.31 | 11.17 | 1.99 | 0.22 | 0.32 | 29.72 | 21.12 | 32.41 | 1.29 | 0.93 | 0.15
Sunny-PM | −54.23 | −27.48 | 1.11 | −22.49 | −1.28 | −0.70 | −31.39 | −26.09 | 34.30 | −19.04 | −1.17 | −0.52
Soil samples (n = 30)
Inside-Dim | −43.91 | −35.70 | −26.83 | 0.24 | −1.53 | −0.90 | −27.73 | −13.60 | −16.32 | −0.21 | −0.63 | −0.53
Inside-Normal | 28.10 | −26.46 | −27.67 | 1.10 | −1.09 | 0.15 | 36.22 | −19.59 | −22.03 | 0.19 | −0.83 | 0.15
Overcast-AM | −12.68 | −8.63 | −5.27 | 0.75 | −0.38 | 0.41 | −7.03 | 1.51 | −0.35 | 0.24 | −0.04 | 0.72
Overcast-PM | 4.59 | 0.62 | 5.22 | 0.26 | 0.02 | 0.58 | 5.27 | 2.92 | 0.00 | −0.13 | 0.03 | 0.91
Sunny-AM | 8.87 | 7.21 | 2.96 | 0.50 | 0.29 | 0.36 | 17.83 | 10.58 | 20.64 | −0.18 | 0.49 | 0.31
Sunny-PM | −34.77 | −13.19 | 10.35 | −18.15 | −0.69 | −1.90 | −14.48 | −10.50 | 21.85 | −15.44 | −0.55 | −1.14
Mean (of n objects) of SD of Errors (of four smartphones)
Color plate squares (n = 24)
Inside-Dim | 22.59 | 20.22 | 25.38 | 1.46 | 0.88 | 1.00 | 6.47 | 4.00 | 7.02 | 1.67 | 0.15 | 0.83
Inside-Normal | 9.97 | 9.36 | 11.53 | 1.13 | 0.35 | 0.57 | 5.80 | 4.46 | 10.00 | 1.25 | 0.14 | 0.55
Overcast-AM | 9.13 | 10.00 | 14.47 | 1.01 | 0.35 | 0.69 | 6.53 | 4.86 | 6.26 | 1.15 | 0.15 | 0.66
Overcast-PM | 9.88 | 10.05 | 12.38 | 1.01 | 0.38 | 0.82 | 6.62 | 4.89 | 6.08 | 1.12 | 0.15 | 0.58
Sunny-AM | 12.67 | 13.56 | 12.93 | 1.10 | 0.49 | 0.81 | 8.71 | 5.99 | 7.93 | 1.23 | 0.19 | 0.77
Sunny-PM | 16.31 | 15.57 | 23.82 | 1.05 | 0.64 | 1.26 | 6.66 | 4.54 | 6.04 | 1.22 | 0.15 | 0.70
Munsell book chips (n = 219)
Inside-Dim | 15.89 | 14.94 | 16.40 | 2.85 | 0.62 | 0.32 | 13.29 | 9.61 | 13.91 | 2.78 | 0.43 | 0.57
Inside-Normal | 7.56 | 4.85 | 6.31 | 1.93 | 0.21 | 0.35 | 10.11 | 9.07 | 10.66 | 1.69 | 0.37 | 0.33
Overcast-AM | 7.11 | 9.35 | 11.45 | 3.39 | 0.33 | 0.51 | 6.72 | 4.84 | 6.71 | 3.35 | 0.18 | 0.46
Overcast-PM | 7.98 | 7.76 | 7.50 | 2.88 | 0.29 | 0.43 | 6.03 | 4.90 | 6.05 | 2.67 | 0.20 | 0.35
Sunny-AM | 11.91 | 11.17 | 7.32 | 2.89 | 0.43 | 0.56 | 6.03 | 4.47 | 9.03 | 2.52 | 0.17 | 0.47
Sunny-PM | 12.90 | 11.59 | 15.41 | 6.44 | 0.49 | 0.71 | 6.39 | 5.34 | 9.38 | 5.85 | 0.24 | 0.72
Soil samples (n = 30)
Inside-Dim | 20.61 | 17.26 | 17.39 | 3.26 | 0.78 | 0.26 | 5.43 | 3.69 | 5.48 | 2.66 | 0.12 | 0.41
Inside-Normal | 8.98 | 9.24 | 7.57 | 1.98 | 0.37 | 0.27 | 4.61 | 2.74 | 3.05 | 1.59 | 0.11 | 0.25
Overcast-AM | 11.26 | 13.08 | 13.15 | 1.48 | 0.50 | 0.36 | 4.56 | 4.14 | 4.22 | 1.32 | 0.17 | 0.26
Overcast-PM | 11.66 | 12.22 | 12.56 | 3.39 | 0.47 | 0.43 | 5.00 | 3.76 | 3.52 | 3.16 | 0.16 | 0.36
Sunny-AM | 18.54 | 17.27 | 12.78 | 1.40 | 0.68 | 0.51 | 6.55 | 3.01 | 4.09 | 0.94 | 0.17 | 0.45
Sunny-PM | 15.81 | 15.39 | 18.70 | 13.72 | 0.64 | 0.35 | 5.24 | 4.04 | 3.09 | 12.10 | 0.16 | 0.78