Article
Peer-Review Record

Can Agricultural Management Induced Changes in Soil Organic Carbon Be Detected Using Mid-Infrared Spectroscopy?

Remote Sens. 2021, 13(12), 2265; https://doi.org/10.3390/rs13122265
by Jonathan Sanderman 1, Kathleen Savage 1,*, Shree R. S. Dangal 1, Gabriel Duran 1, Charlotte Rivard 1, Michel A. Cavigelli 2, Hero T. Gollany 3, Virginia L. Jin 4, Mark A. Liebig 5, Emmanuel Chiwo Omondi 6, Yichao Rui 7 and Catherine Stewart 8
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Reviewer 4: Anonymous
Submission received: 16 April 2021 / Revised: 28 May 2021 / Accepted: 5 June 2021 / Published: 9 June 2021

Round 1

Reviewer 1 Report

Review of Remote Sensing-1206452 – “Can agricultural management induced changes in soil organic carbon be detected using mid-infrared spectroscopy?” The manuscript presents valuable and timely information through a scientifically sound approach. The manuscript in general is well-written and of adequate detail. The comparisons across the various study sites require statistical analyses, which the authors perform commendably.

 

That stated, two additional statistical analyses would aid in result evaluation and conclusion development. First, the means in Table S3 should be evaluated with a paired t-test. Second, a concordance correlation coefficient (CCC) such as Lin’s concordance coefficient (Lawrence and Lin, 1989) should be used to compare the regression lines of the observed vs. the EML-estimated values. McBride (2005) provides criteria for evaluating the degree of agreement between data sets that a CCC value represents.
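For reference, both suggested statistics are straightforward to compute. A minimal Python sketch follows; the SOC% values below are hypothetical stand-ins for the observed and EML-estimated columns of Table S3, not data from the manuscript.

```python
import numpy as np
from scipy import stats

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient (Lawrence and Lin, 1989)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    # Population (biased) variances and covariance, per Lin's definition
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical observed vs. EML-estimated SOC% pairs
obs = np.array([1.10, 0.85, 2.30, 1.75, 0.60, 3.10])
eml = np.array([1.05, 0.80, 2.20, 1.70, 0.55, 3.00])

ccc = lins_ccc(obs, eml)           # near 1 when agreement with the 1:1 line is close
t, p = stats.ttest_rel(obs, eml)   # paired t-test on the mean difference
```

A CCC near 1 indicates close agreement with the 1:1 line, while the paired t-test can flag a small but systematic offset between the means even when the correlation is high, as in this toy example.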

 

Another weak aspect of the manuscript is limited clarity on sample numbers across sites and on the values used in the statistical analyses. The statistical analyses appear valid; it is, however, difficult to follow the listed df values.

 

The KBS site differs in original sampling strategy, which alters the ANOVA. Additional discussion and clarity are needed on the derivation and evaluation of sample numbers at the KBS site and how it fits into the overall samples evaluated. More specific information across all sites on sample numbers at each depth increment, plot size, and replicate number would add key details, clarify statistical results, and help support study conclusions. A few labelling and/or typo errors exist in the figures and tables. Specific comments follow that are referenced to page and line number.

 

The manuscript is appropriate for publication in Remote Sensing. A minor revision is needed to address the additional recommended statistical analyses. The outcome of these analyses may alter strength of conclusions. Minor corrections are needed for text errors, and to sharpen a few key details.

 

Detailed Comments:

 

 

Abstract: Among the key limitations to building credible soil carbon sequestration programs is the cost of measuring soil carbon change. Diffuse reflectance spectroscopy (DRS) is a feasible low-cost alternative to traditional laboratory analysis of soil organic carbon (SOC). Whereas numerous studies have shown that DRS can produce accurate and precise estimates of SOC across landscapes, whether DRS can detect subtle management induced changes in SOC at a given site has not been answered. Here, we leverage archived soil samples from seven long-term research trials in the U.S. to test this question using mid infrared (MIR) spectroscopy coupled with the USDA-NRCS Kellogg Soil Survey Laboratory MIR spectral library. Overall, MIR-based estimates of SOC% were excellent with the root mean square error ranging from 0.10 to 0.33% across the seven sites. In all but two instances, the same statistically significant (p < 0.10) management effect was found using both the lab-based SOC% and MIR estimated SOC% data. Despite slightly greater uncertainty in MIR-estimates of SOC%, our results suggest that large existing MIR spectral libraries based upon measured SOC values can be operationalized in other laboratories for successful carbon monitoring.

 

Page 1 Line 1

There is substantial political and scientific momentum …

 

Page 3

Add a figure that shows LTR locations in USA. A USA outline with states and sites within states would suffice.

 

Table 1

The ID and Treatment entries overlap in the Rodale treatment column.

The Beltsville ID symbol for 2-year rotation is misaligned offsetting subsequent entries.

 

Page 4,

Table 1 should incorporate the following:

  1. Sample numbers by depth increment for each site.
  2. plot size and replicate numbers used at each site.

 

The sample numbers collected at the stated depths are unclear. Table 1 shows that Lincoln REAP and TCSE, Mandan, and Beltsville have samples collected to 150, 150, 60, and 50 cm, respectively. As an example, the Lincoln-REAP site has sample depths of 0-7.5, 7.5-15, and 15-30 cm, while the full sample range is 0-150 cm. How many samples were collected in each of the 0-7.5, 7.5-15, 15-30, and 30-150 cm depth increments? Deeper samples will contain smaller quantities of C and are less likely to be modified over the duration of the ARS studies. Carbon values of deeper horizons can influence study results, but C in deeper horizons (e.g., > 50 cm) is unlikely to be monitored for carbon credits. Thus, it is valuable to convey the sample numbers collected at all depth increments.

 

Sample numbers and the number of replicates at the KBS site require increased explanation. The repeated measures ANOVA, slope comparisons, and box plots employ different sample numbers than the other sites. It is unclear how the degrees of freedom (df) were derived because clarity on sample numbers is limited.

 

Spatial variability in OC values presents another inherent challenge for carbon monitoring, regardless of how carbon is measured. Plot size and sample numbers together would provide a gauge of sample density at each site.

 

Line 122

Delete the duplicated “this”: All soil samples (1855) used in this …

 

Line 122 and Tables S1 and S3

The sample number stated in line 122 – 1855 does not match the 1377 total listed in Table S3 for observed and EML carbon values. The difference is 478 samples. Why this discrepancy? Added explanation is needed.

 

Table S1 and Table S3

The n value (168) in Table S1 for Pendleton differs from the n value (264) for Pendleton in Table S3. Which value is correct? If the values legitimately differ, explain the discrepancy.

 

Table S4

Table S4 lists an n value of 362 for Beltsville. Table S1 and Table S3 both show a Beltsville n value of 390. Which value is correct? Which n value was used in the summary statistics? Clarification is needed, and additional explanation may be required depending on the answers to these questions.

 

Line 132

How was the KSSL 240-subset selected? What criteria were employed? Or was the selection random? Did the subset include all sites? Need greater explanation.

 

Page 6

 

Line 225  

effect size

 

 

Line 230

Why were slopes tested only for the KBS site? Is the slope analysis simply an additional evaluation? Or was there a specific reason to evaluate the KBS site?

 

Line 240

The EML predicted SOC% at a slightly larger range (0.04-6.54%) across all trials (Table S3) compared to corresponding observed measurements (0.02-6.40%).

 

Page 7

 

Table S3

The data and analysis in Table S3 are a central focus of the manuscript and should be included in the manuscript itself, not in the supplementary documents.

 

At all sites and in the summary data, the Obs mean values are greater than the EML mean values. A paired t-test would determine whether the means are statistically different.

 

 

Figure 2

The labeled regressions for (g) and (h) appear reversed. Graph (g) matches data from Rodale and graph (h) matches Lincoln-TCSE data.

 

On screen, Figure 2 is slightly blurred. What is the resolution of the figure?

Also, the vertical scale is shorter than the horizontal scale, even though both axes show the same variable. The scale difference complicates comparison of values and line slopes. Equivalent horizontal and vertical scales would streamline visual comparison but increase figure size. This reviewer recommends increasing the resolution of Figure 2 and creating graphs with equal axis scales. Also, site names could be labeled directly on each graph, as sufficient space is available.

 

Line 248

Replace “while” with “whereas”: Mandan, Lincoln-REAP and Lincoln-TCSE) whereas Beltsville and Pendleton had slopes > 1 (Table S4).

 

Line 263

Replace “dropping” with “decreasing”: decreasing from 0.95 to 0.91.

 

Page 8

 

Line 276

However, there were two cases (Lincoln Reap and Beltsville)….

 

Line 286.

The EML detects the differences, however the mean values are lower at all depths.

 

Line 287

Table S5 gives 144 df for EML OC, and a visual count of Figure S1 plots about the same number. The discussion does not state the source of the additional samples and data points; Table S3 lists an n of 28 for KBS. More explanation is needed about the additional 116 spectroscopy measurements, as it is unclear how the extra data points and samples were derived. The reason for the added samples appears to stem from the way replicates were sampled; this needs greater explanation.

 

The box plot in Figure S1 that includes the additional KBS data shows a much greater range for the spectroscopy data compared to the observed OC. This suggests that OC has considerable spatial variability within the study plots that is not captured in the observed data. The increased number of measurements afforded by MIR-based estimates can provide greater resolution of spatial variability even though precision is slightly lower than that of high-temperature combustion.

 

Line 293

Reduced-input and never-tilled treatments (T3, T8, Figure 4f) observed and EML-predicted

 

Supplemental Documents

 

Figure S4

In caption “(d)” should be (b)

 

Page 10

 

Figure 4: The KBS observed (measured) and predicted data plotted in this figure show about 10 points in each of the six graphs, which yields roughly 60 samples. Table S3 and page 4, line 160 state that the KBS site had 28 samples. This point was made previously. An explanation for the additional data points is needed somewhere in the manuscript.

 

Line 327

Differences among treatments at Lincoln TCSE within years 1999 and 2012 (Figure S3) were detected, but Figure S3 shows the year as 2011.

 

Page13

 

Line 426

Replace “(Figure 4)” with “(Figure 6)”: Spectroscopy-based estimates of SOC% (Figure 6). These estimates aligned with previous investigations

 

Line 452

Replace “greater sample numbers” with “larger sample sizes”: The reduced cost and increased throughput afforded by soil spectroscopy can allow for larger sample sizes

 

References

References 1, 3, 11, 15, 21, and 25 list the first author’s name only, followed by et al. A full author list should be included for each of these references.

 

 

References cited in review comments

 

Lawrence, I. and Lin, K. 1989. A concordance correlation coefficient to evaluate reproducibility. Biometrics, 255-268.

 

McBride, G.B., 2005. A proposal for strength-of-agreement criteria for Lin’s concordance correlation coefficient.

Author Response

We thank Reviewer 1 for their comments and have addressed them below (responses in italics).

 

Comments and Suggestions for Authors

Review of Remote Sensing-1206452 – “Can agricultural management induced changes in soil organic carbon be detected using mid-infrared spectroscopy?” The manuscript presents valuable and timely information through a scientifically sound approach. The manuscript in general is well-written and of adequate detail. The comparisons across the various study sites require statistical analyses, which the authors perform commendably.

That stated, two additional statistical analyses would aid in result evaluation and conclusion development.

First, the means in Table S3 should be evaluated with a paired t-test.

We thank the reviewer for these suggestions and have added a paired t-test to compare mean observed vs. mean EML values for each trial across all years and depths. We find that there are significant differences in mean values for the Beltsville, Pendleton, REAP and TCSE trials but not for the KBS and Mandan trials. After considering the implications of these consistent differences between predicted and observed means, we calculated bias and have included that statistic in Table 2 (formerly Table S4) and Table 3 (formerly Table 2) so it can be viewed side-by-side with the other goodness-of-fit metrics. It is apparent that the EML predictions under-predict, primarily at low SOC values (< 1%).

As discussed further below, we have used these findings, coupled with the CCC data, to paint a more complex picture of the quality of the EML predictions in the discussion section. In particular, we highlight the difference between predictions on the primary instrument (bias = 0.02) versus the secondary instrument (bias = 0.25), suggesting that we need to improve the calibration transfer procedure.

Further, we became interested in understanding how much the bias was impacting the interpretation of the agronomic trial results, so we added a new figure that plots the difference in SOC% between the highest-SOC treatment and the other treatments at each trial using both the observed SOC% data and the EML predictions. This new figure illustrates that the pattern of change was preserved and that there was only a small bias towards underestimating the magnitude of change between treatments. It has been added as Figure 7, along with a paragraph in both the results and discussion sections.
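The bias statistic discussed above (mean predicted minus observed) can be sketched in a few lines; the SOC% pairs below are hypothetical, chosen only to mimic the under-prediction at low SOC values described in this response, and the 1% split is illustrative.

```python
import numpy as np

# Hypothetical observed vs. EML-predicted SOC% pairs (illustration only)
obs  = np.array([0.40, 0.70, 0.90, 1.80, 2.50, 3.20])
pred = np.array([0.20, 0.55, 0.80, 1.78, 2.52, 3.18])

bias = np.mean(pred - obs)          # negative value => under-prediction overall
low = obs < 1.0                     # low-SOC subset (< 1% SOC)
bias_low = np.mean(pred[low] - obs[low])
bias_high = np.mean(pred[~low] - obs[~low])
```

In this toy example the overall bias is dominated by the low-SOC subset, which is the pattern a bias column alongside the other goodness-of-fit metrics is meant to expose.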

Secondly, a concordance correlation coefficient (CCC) such as Lin’s concordance coefficient (Lawrence and Lin, 1989) should be used to compare the regression lines of the observed vs. the EML-estimated values. McBride (2005) provides criteria for evaluating the degree of agreement between data sets that a CCC value represents.

We thank the reviewer for this suggestion and have added the CCC as an additional column in Table 2 and Table 3. The statistical methods have been added to Section 2.4, Statistical Analysis (line 235).

We disagree, however, on how to interpret the CCC results. The criteria of McBride (2005) are not appropriate for spectroscopy; that study compared the same established analytical technique between similar instruments. In soil spectroscopy, CCC interpretation is highly variable. For example, Singh et al. (2019, https://doi.org/10.1016/j.catena.2019.104185) considered values > 0.8 to be substantial, 0.8 > CCC > 0.7 to be moderate, and CCC < 0.7 to be poor. Additionally, qualitative interpretations of CCC, RPD or R2 values are extremely subjective, and there is debate in the pedometric literature as to the value of placing judgements on these statistics (see, for example, Minasny and McBratney 2013, “Why you don’t need to use RPD”, in Pedometron). The acceptable level of uncertainty will stem from the purpose of the investigation. If soil spectroscopy is used for regional-scale mapping, then greater levels of uncertainty are acceptable; however, if spectroscopy is being used to look for very small differences in soil properties, then the confidence needs to be greater. For this purpose, the model error (RMSE or MAE) is a more informative statistic for quantitatively assessing whether the spectroscopy-based estimates are good enough for the question being posed.

In addressing these two suggestions of Reviewer 1, we have rethought and revised how we discuss the quality of the spectroscopy-based estimates. At many of the sites there is a significant difference between measured and predicted values, and the CCC statistic clearly shows that the quality of the predictions at some individual sites is not as robust as the other statistics might suggest, primarily due to bias at the low end of SOC% values. This sentiment is now incorporated into the abstract, discussion and conclusions.

Another weak aspect of the manuscript is limited clarity on sample numbers across sites and on the values used in the statistical analyses. The statistical analyses appear valid; it is, however, difficult to follow the listed df values.

We understand how the differences in sample numbers were confusing. There was one error in reporting sample numbers between tables, but mostly it was that we scanned more samples than we had analytical data for, so sample numbers could differ between statistical tests. We have carefully gone through all the tables and text to make sure the sample numbers now match and, where they do not, that there is a clear explanation of the difference. The main trial that was confusing was KBS, and we have redone that analysis to focus only on samples with analytical data to avoid future confusion (see below).

 The KBS site differs in original sampling strategy, which alters the ANOVA. Additional discussion and clarity are needed on the derivation and evaluation of sample numbers at the KBS site and how it fits in the overall samples evaluated.

We are unsure what the reviewer is specifically referring to regarding the original sampling strategy at KBS. There may simply have been some confusion because KBS was treated differently from the other trial sites: here we used a repeated measures ANOVA because we only analyzed one replicate (soils from the same replicate, R4, were used in all years) of each trial over time. This was stated in the text but was not clear in the summary table.

Additionally, this comment made us rethink the KBS data, and we realized that there was a slight error in Figure 4: we had plotted both the supplied SOC and the reanalyzed SOC data for the observed trend, although we had decided not to use the supplied data because it did not correlate with the MIR data (Table S1).
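For readers tracing the df values, a repeated measures ANOVA of this kind (the same plots resampled over several years) can be sketched as follows; the plot and SOC% values are hypothetical, and statsmodels’ AnovaRM is used only as an illustrative implementation, not the analysis code actually used in the study.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical SOC% for four plots, each resampled in three years
df = pd.DataFrame({
    "plot": [1, 2, 3, 4] * 3,
    "year": ["2000"] * 4 + ["2006"] * 4 + ["2012"] * 4,
    "soc":  [1.2, 1.3, 1.1, 1.4,
             1.3, 1.4, 1.2, 1.5,
             1.5, 1.6, 1.3, 1.7],
})

# Subject = plot, within-subject factor = year
res = AnovaRM(df, depvar="soc", subject="plot", within=["year"]).fit()
print(res.anova_table)  # F Value, Num DF, Den DF, Pr > F for the year effect
```

With four subjects and three time points, the year effect has 2 and 6 degrees of freedom, which shows how the df in such a table trace back to the sampling layout.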

More specific information across all sites on sample numbers at each depth increment, plot size, and replicate number would add key details, clarify stat results, and help support study conclusions. A few labelling and/or typo errors exist in the figures and tables. Specific comments follow that are referenced to page and line number.

Section 2.1 has been substantially revised to include additional details on the experimental design, soil sampling strategy and which samples were used in this study.

 The manuscript is appropriate for publication in Remote Sensing. A minor revision is needed to address the additional recommended statistical analyses. The outcome of these analyses may alter strength of conclusions. Minor corrections are needed for text errors, and to sharpen a few key details.

 Detailed Comments:

Abstract: Among the key limitations to building credible soil carbon sequestration programs is the cost of measuring soil carbon change. Diffuse reflectance spectroscopy (DRS) is a feasible low-cost alternative to traditional laboratory analysis of soil organic carbon (SOC). Whereas numerous studies have shown that DRS can produce accurate and precise estimates of SOC across landscapes, whether DRS can detect subtle management induced changes in SOC at a given site has not been answered. Here, we leverage archived soil samples from seven long-term research trials in the U.S. to test this question using mid infrared (MIR) spectroscopy coupled with the USDA-NRCS Kellogg Soil Survey Laboratory MIR spectral library. Overall, MIR-based estimates of SOC% were excellent with the root mean square error ranging from 0.10 to 0.33% across the seven sites. In all but two instances, the same statistically significant (p < 0.10) management effect was found using both the lab-based SOC% and MIR estimated SOC% data. Despite slightly greater uncertainty in MIR-estimates of SOC%, our results suggest that large existing MIR spectral libraries based upon measured SOC values can be operationalized in other laboratories for successful carbon monitoring.

 Page 1

 Line 1

There is substantial political and scientific momentum …

Revised as noted.

 Page 3

Add a figure that shows LTR locations in USA. A USA outline with states and sites within states would suffice.

We have added a figure to the supplementary documents (Figure S1) showing the locations of all trials across the US and referenced Figure S1 on line 124.

Table 1

The ID and Treatment entries overlap in the Rodale treatment column.

The Beltsville ID symbol for 2-year rotation is misaligned offsetting subsequent entries.

I have realigned the rows; apologies, this was a formatting issue.

 Page 4,

Table 1 should incorporate the following:

  1. Sample numbers by depth increment for each site.

 

Table 1 contains additional information regarding sample size per depth and we have added some additional sample sizes to the figure captions in the manuscript to assist in discerning differences in sample sizes across years, treatments and depth.

 

  1. plot size and replicate numbers used at each site. 

 

Additional text has been added to the section 2.1 to address plot size and replicates.

The sample numbers collected at the stated depths are unclear. Table 1 shows that Lincoln REAP and TCSE, Mandan, and Beltsville have samples collected to 150, 150, 60, and 50 cm, respectively. As an example, the Lincoln-REAP site has sample depths of 0-7.5, 7.5-15, and 15-30 cm, while the full sample range is 0-150 cm. How many samples were collected in each of the 0-7.5, 7.5-15, 15-30, and 30-150 cm depth increments? Deeper samples will contain smaller quantities of C and are less likely to be modified over the duration of the ARS studies. Carbon values of deeper horizons can influence study results, but C in deeper horizons (e.g., > 50 cm) is unlikely to be monitored for carbon credits. Thus, it is valuable to convey the sample numbers collected at all depth increments.

Not all depths or years at each trial were used in the ANOVA analysis; the years and depths used in each trial’s ANOVA are highlighted in bold. Only in the overall comparison of observed and EML-predicted SOC% were all depths and years used, and the relationship across depths is visualized by the color legend (Figure 2). We have added additional information regarding sample size to Table 1 and Table 4, as well as in the figure captions.

Sample numbers and the number of replicates at the KBS site require increased explanation. The repeated measures ANOVA, slope comparisons, and box plots employ different sample numbers than the other sites. It is unclear how the degrees of freedom (df) were derived because clarity on sample numbers is limited.

As explained above, we have revised the KBS analysis and the display of the KBS data. There was an error in the original Figure 4, which included the supplied OC data that we had found to be inconsistent with the predictions; thus, 28 samples were reanalyzed. We have now run all of the statistics on these 28 samples.

Spatial variability in OC values presents another inherent challenge for carbon monitoring, regardless of how carbon is measured. Plot size and sample numbers together would provide a gauge of sample density at each site.

We completely agree with the reviewer on this point, but it is also a large topic that is well beyond the scope of this paper. Additionally, we are constrained by the agronomic trial sampling design implemented at each trial; this type of experimental design attempts to minimize variance rather than capture spatial variability. One of the promising aspects of soil spectroscopy is that it will enable an increased density of soil sampling, so as long as the predictions are unbiased, the net result should be a better spatial representation of SOC across a farm.

We have included more information on plot size, sample numbers and how the samples were collected (individual cores or composite samples) in the methods section (Section 2.1).

 Line 122

Delete the duplicated “this”: All soil samples (1855) used in this …

Corrected 

Line 122 and Tables S1 and S3

The sample number stated in line 122 – 1855 does not match the 1377 total listed in Table S3 for observed and EML carbon values. The difference is 478 samples. Why this discrepancy? Added explanation is needed.

Table S3 compares the means of observed and EML-predicted SOC%. We received 1855 soil samples in total, all of which were scanned in the Woodwell laboratory; however, some of these samples did not have accompanying SOC% measurements. We analyzed a number of additional soil samples for SOC% at the Woodwell laboratory, but we did not analyze them all. There are therefore more EML-predicted SOC% values (1855) than observed SOC% values (1377).

Table S1 and Table S3

The n value (168) in Table S1 for Pendleton differs from the n value (264) for Pendleton in Table S3. Which value is the correct? Or explain the discrepancy, if the values differ.

When we first did the initial outlier screening we did not have the 2014 data from Pendleton, and we overlooked revising that analysis when we received the additional samples. We have now performed the PLSR screening step with all 264 samples and have updated the results in Table S1: R2 = 0.89, RMSE = 0.09% and n = 264. The results were nearly identical, with just a very small (0.01%) improvement in the new model fit and no outliers detected.

 Table S4

Table S4 lists a n value of 362 for Beltsville. Table S1 and Table S3 both show a Beltsville n value of 390. Which value is correct? Which n value was used in the summary stat calculation? Clarification is needed. Additional explanation may be needed dependent upon answers to these questions.

The value 390 is correct; 362 was a typo. Please also note that, due to moving tables, this is now Table 2.

 Line 132

How was the KSSL 240-subset selected? What criteria were employed? Or was the selection random? Did the subset include all sites? Need greater explanation.

This selection process was not explained in the initial submission. The samples come from all trials and represent all treatments, depths and years. We have added the following sentences (line 155): “The 240 samples were chosen to span all locations, at least one replicate of each treatment in multiple years and the full range of depths for the chosen treatments. This resulted in 28 samples from KBS, 36 from Lincoln-REAP, 52 from Rodale, 44 from Mandan, 44 from Pendleton and 36 from Beltsville.”

 Page 6

 Line 225  

effect size

Corrected

 Line 230

Why were slopes tested only for the KBS site? Is the slope analysis simply an additional evaluation? Or was there a specific reason to evaluate the KBS site?

KBS was the only site where we had a long time series of data for each treatment, so we felt this was an interesting way of comparing the data in addition to the repeated measures ANOVA.

 Line 240

The EML predicted SOC% at a slightly larger range (0.04-6.54%) across all trials (Table S3) compared to corresponding observed measurements (0.02-6.40%).

This is an interesting observation. Looking at all of the trials, 4 out of 7 have a greater range in the predicted vs. observed data (using the interquartile range). We have added this observation to the results (line 270), along with a discussion of the t-test between predicted and observed mean OC values (line 385+).

Page 7

 Table S3

The data and analysis in Table S3 are a central focus of the manuscript and should be included in the manuscript itself, not in the supplementary documents.

After the addition of CCC and bias to Table S4, we feel that it is the more critical table to include in the main manuscript. This change has been made, and it is now Table 2 in the main manuscript.

At all sites and in the summary data, the Obs mean values are greater than the EML mean values. A paired T test of the means would determine if the means are statistically different.

This has been addressed above.

 Figure 2

The labeled regressions for (g) and (h) appear reversed. Graph (g) matches data from Rodale and graph (h) matches Lincoln-TCSE data.

Yes, these labels were reversed. This has been fixed in the Figure 2 caption, and the order has been corrected in Table 2 (formerly Table S4).

On screen, Figure 2 is slightly blurred. What is the resolution of the figure?

Also, the vertical scale is shorter than the horizontal scale, even though both axes show the same variable. The scale difference complicates comparison of values and line slopes. Equivalent horizontal and vertical scales would streamline visual comparison but increase figure size. This reviewer recommends increasing the resolution of Figure 2 and creating graphs with equal axis scales. Also, site names could be labeled directly on each graph, as sufficient space is available.

We removed the vertical and horizontal grid lines in all manuscript figures, as these caused difficulty in visualizing the data, and we increased the y-axis size. We added labels for each graph in Figure 2. Graphs were submitted at 300 dpi as required by the journal; however, we have increased this to 600 dpi for greater clarity and have increased the length of the y-axis. We cannot make the graphs square because they would then not fit on one page.

 Line 248

Replace “while” with “whereas”: Mandan, Lincoln-REAP and Lincoln-TCSE) whereas Beltsville and Pendleton had slopes > 1 (Table S4).

Corrected

 Line 263

Replace “dropping” with “decreasing”: decreasing from 0.95 to 0.91.

Corrected

 Page 8

 Line 276

However, there were two cases (Lincoln Reap and Beltsville)….

Corrected

 Line 286.

The EML detects the differences, however the mean values are lower at all depths.

We revised this sentence to include the statement “but at lower absolute values.”

 Line 287

Table S5 gives 144 df for EML OC, and a visual count of Figure S1 plots about the same number. The discussion does not state the source of the additional samples and data points; Table S3 lists an n of 28 for KBS. More explanation is needed about the additional 116 spectroscopy measurements, as it is unclear how the extra data points and samples were derived. The reason for the added samples appears to stem from the way replicates were sampled; this needs greater explanation.

Please note that for the KBS site we received more samples than there were observed measurements. The goal of comparing the limited set of observed KBS measurements with the greater number of EML predictions was to examine how well changes were captured in the predictions at differing sample sizes. However, as previously stated, we have revised this analysis and figure to include only data for which we have both observed and predicted values.

 The Box plot in S1 that includes the additional KBS data shows a much greater range for the spectroscopy data compared to the observed OC.  This suggests that OC has considerable spatial variability within the study plots that is not captured in the observed data. The increased number of measurements afforded by MIR based estimates can provide greater resolution on spatial variability even though precision is slightly less than high temperature combustion.

The additional points in Figure S2 reflect greater temporal rather than spatial coverage; in that sense, it was not an apples-to-apples comparison. However, in response to the confusion around sample numbers, we have redone the analysis of the KBS trial to include only the samples that have analytical data, so that it is an apples-to-apples comparison.

Line 293

Reduced-input and never-tilled treatments (T3, T8, Figure 4f) observed and EML-predicted

Corrected

 Supplemental Documents

 Figure S4

In caption “(d)” should be (b)

Corrected

 Page 10

 Figure 4 The KBS observed (measured) and predicted data plotted in this figure show about 10 points in each of the six graphs, which yields roughly 60 samples. Table S3 and page 4, line 160, state that the KBS site had 28 samples. This point was made previously. Added explanation for the additional data points is needed somewhere in the manuscript.

As explained above, this was due to a data error that has been corrected in the figures and tables.

 Line 327

Differences among treatments at Lincoln TCSE within years 1999 and 2012 (Figure S3) were detected. Figure S3 shows year as 2011.

Apologies for the confusion. Data years spanned 2011-2012. The manuscript text has been corrected for consistency.

 Page13

 Line 426

Spectroscopy-based estimates of SOC% (Figure 4; should be Figure 6). These estimates aligned with previous investigations

Corrected

 Line 452

The reduced cost and increased throughput afforded by soil spectroscopy can allow for larger sample sizes

 Corrected

References

References 1, 3, 11, 15, 21, and 25 list the first author name only, followed by et al. A full author list should be included for each of these references.

These have been corrected.

  References cited in review comments

 Lawrence, I. and Lin, K. 1989. A concordance correlation coefficient to evaluate reproducibility. Biometrics 45, 255-268.

 McBride, G.B., 2005. A proposal for strength-of-agreement criteria for Lin's concordance correlation coefficient.
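For readers unfamiliar with the statistic the reviewer recommends above, the following is an illustrative sketch (not part of the manuscript) of Lin's concordance correlation coefficient, using the population-variance form from Lin (1989); the function name is hypothetical:

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement
    series (e.g. observed vs. EML-predicted SOC%). Returns 1.0 only for
    perfect agreement; penalizes both scatter and location/scale shifts."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    # Population (biased, ddof=0) variances and covariance, per Lin (1989)
    sx2, sy2 = x.var(), y.var()
    sxy = np.mean((x - mx) * (y - my))
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)
```

Unlike Pearson's r, a constant offset between observed and predicted values lowers the CCC, which is why it suits method-agreement comparisons such as combustion vs. spectroscopy estimates.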


Reviewer 2 Report

The authors of this research study present a very interesting question with the right answer and a well-written manuscript. The paper deals with the use of MIR laboratory DRS (non-destructive) for detecting SOC on a wide selection of archived agricultural soil samples from seven long-term research sites representing diversity in management, climate, and soil types of the United States. They measured the spectral range from 1666 nm to 55555 nm and built PLSR models for each LTR sample set individually as a way of screening for potential errors in the analytical or spectral data. Their estimates of SOC% were optimal, with RMSE between 0.10 and 0.33%.

They conclude their abstract with the sentence: "Despite some additional uncertainty in the MIR estimates of SOC%, these results suggest that large existing MIR spectral libraries can be operationalized in other laboratories for successful carbon monitoring."

In my opinion, however, this operation is not cost-effective and not sustainable for a real monitoring program of SOC estimated through remote sensing data. Moreover, there is no mention of real applicability using the actual remote sensing data with spectral/spatial resolutions, SNR, and so on in the spectral range, they used for this study.

I suggest the authors two ways:

  1. Submit the article in the present form to another journal more pertinent to the topic they are presenting, i.e. the use of DRS spectroscopy-based estimates of SOC to detect subtle management-induced changes over time.
  2. Re-submit after new consideration, experiments, and description of the downscaling of their technique to actual remote sensing technology so as to understand its applicability to a more actual monitoring program. See e.g. the works of (Castaldi, F., et al. "Evaluation of the potential of the current and forthcoming multispectral and hyperspectral imagers to estimate soil texture and organic carbon." Remote Sensing of Environment 179 (2016): 54-65.) or (Gholizadeh, A., et al. "Soil organic carbon and texture retrieving and mapping using proximal, airborne and Sentinel-2 spectral imaging." Remote Sensing of Environment 218 (2018): 89-103.) or (Angelopoulou, T., et al. "Remote sensing techniques for soil organic carbon estimation: A review." Remote Sensing 11.6 (2019): 676.).

Author Response

We thank Reviewer 2 for their suggestions and have responded below (italics).


 

We appreciate the concern of the reviewer that we presented only a proximal sensing study to the journal Remote Sensing. However, we were approached by the journal to submit a paper to this special issue which focuses on proximal and remote sensing. While lab-based MIR spectroscopy is not easily transferable to remote sensing, the chemometric applications discussed in this paper certainly are. We will defer to the editors of this special issue as to whether or not the topic is appropriate.

Reviewer 3 Report

The topic of the work is not new, but the use of innovative methods that can quickly and cheaply determine SOC is attractive to scientists engaged in research related to the content of organic carbon in soils. The work is clearly written, supplemented by a suitable number of tables and figures, which provide an objective view of the ability of DRS to predict SOC contents in various studies. It should be appreciated that the authors provide source data in supplementary materials. I suggest publishing the manuscript in its present form. Only one correction should be made:

Page 13, line 426: substitute “(Figure 4)” with “(Figure 6)”

Author Response

We thank Reviewer 3 for their comments.

 


 

Thank you.  This has been corrected.

Reviewer 4 Report

The paper “Can Agricultural Management Induced Changes in Soil Organic Carbon be Detected using Mid-infrared Spectroscopy?” submitted by the researchers Jonathan Sanderman, Kathleen E. Savage *, Shree R.S. Dangal, Gabe Duran, Charlotte Rivard, Michel Cavigelli, Hero T. Gollany, Virginia Jin, Mark A. Liebig, Emmanuel C. Omondi, Yichao Rui, and Catherine Elizabeth Stewart proves to be a very consistent piece of research, made by a very experienced team. Congratulations to the author team.

Author Response

We thank Reviewer 4 for their comments.


Round 2

Reviewer 1 Report

Follow-up Review of Remote Sensing-1206452 – “Can agricultural management induced changes in soil organic carbon be detected using mid-infrared spectroscopy?”

The authors have earnestly addressed the comments and suggested edits identified in my initial review. The manuscript should be accepted as revised.

A few minor typographical errors need correction:

Line 33 political misspelled

Line 97 delete the word located

Line 208  memory-based should be hyphenated

Line 212 should be - principal components not principle

Line 327 significant misspelled

Line 394 Pendleton misspelled

Line 396 precision misspelled

Line 433 negligible misspelled

Line 502 consistently misspelled

Reviewer 2 Report

Again, the paper is well-written, with a complete description of the use of DRS as a viable low-cost alternative to traditional laboratory analysis of SOC.

Moreover, I believe it is important that the authors address this issue in their manuscript by adding the constraints, limitations, and advantages of using MIR spectroscopy for SOC estimation.

In my opinion, currently, lab-based MIR spectroscopy is not easily transferable to remote sensing. 

I leave the final decision to the editors as to whether the topic is appropriate for their SI.
