# UBathy: A New Approach for Bathymetric Inversion from Video Imagery

*Reviewer 1:* Anonymous

*Reviewer 2:* Anonymous

*Reviewer 3:* Anonymous

*Reviewer 4:* Anonymous

**Round 1**

*Reviewer 1 Report*

This paper presents a relatively novel way of inferring bathymetry from video imaging using PCA rather than the methods inherent in CBathy. Given the scientific community's current dependence on CBathy, providing an alternative means of bathymetric inversion will be of great interest to readers. A few suggested changes are needed, however, to highlight both the potential impact of the methods described in this work, as well as the limitations inherent in the novel method being described.

At the end of the 4th paragraph in the introduction (Lines 65-66), the authors note that there are limitations and/or known problems with cBathy. These problems and limitations should be succinctly reviewed in the introduction. A few sentences should suffice. This will allow the authors to more clearly set up exactly how their proposed method will address and improve on the challenges they note with cBathy.

A more critical comment is that the authors are very vague on the computation time required to run UBathy to a successful conclusion. Given how computationally expensive a full run of CBathy can be, this is not a minor oversight. A method that improves CBathy but that, for example, takes 2x the computing power, is not the same contribution as a method whose improvements require only a minor increase in computing power. In fact, the authors note several times in the text that, at times, UBathy is computationally expensive, but do not provide details. Not all researchers have unlimited resources with respect to computing time or power, and a proper assessment of this new method needs to include a comparison not just of the results in the selected test cases, but also of the computation power required to derive the results. This should not be extensive to address: an additional column in Tables 3-7 that provides an estimate of the average computational time required to achieve the noted results and RMSE errors, together with relevant explanations in the associated text, would help address this oversight. In addition, a similar column in Table 9 would allow comparison of the computation time for the presented results obtained with CBathy vs. UBathy.

*Author Response*

This paper presents a relatively novel way of inferring bathymetry from video imaging using PCA rather than the methods inherent in CBathy. Given the scientific community's current dependence on CBathy, providing an alternative means of bathymetric inversion will be of great interest to readers. A few suggested changes are needed, however, to highlight both the potential impact of the methods described in this work, as well as the limitations inherent in the novel method being described.

**Reply1:** we very much appreciate the thorough revision and comments of the Reviewer. All the modifications are highlighted in the annotated new version of the manuscript. We must remark here that the goal of this work is to introduce the methodology; performing a fully detailed comparison of the proposed methodology with cBathy is out of the scope of the present work. Nevertheless, some comparisons are, of course, provided. Please find the replies below.

At the end of the 4th paragraph in the introduction (Lines 65-66), the authors note that there are limitations and/or known problems with Cbathy. These problems and limitation should be succinctly reviewed in the introduction. A few sentences should suffice. This will allow the authors to more clearly set up exactly how their proposed method will address and improve on the challenges they note with Cbathy.

**Reply2:** the goal of introducing the sentence “Some limitations and/or known problems of cBathy have been reported in the literature” is to emphasize that the bathymetry inversion problem is not yet solved. We do not mean that the new methodology will necessarily solve these problems. Actually, the only comparison we have made to cBathy is positive, but a systematic study would be needed to draw any conclusion. Some of the aspects that are known to be a problem in cBathy are: high wave heights (this problem remains in uBathy, at least to some extent), dealing with wet/dry tiles (this is not a problem in uBathy, although it is not clearly shown in this work) and dealing with waves longer than the tiles (this is not a problem in uBathy since it works globally in space). In any case, we prefer to be very careful in this regard, and we have added a short clarification. The changes are highlighted in the new version of the manuscript.

A more critical comment is that the authors are very vague on the computation time required to run UBathy to a successful conclusion. Given how computationally expensive a full run of CBathy can be, this is not a minor oversight. A method that improves CBathy but that, for example, takes 2x the computing power, is not the same contribution as a method whose improvements require only a minor increase in computing power. In fact, the authors note several times in the text that, at times, UBathy is computationally expensive, but do not provide details. Not all researchers have unlimited resources with respect to computing time or power, and a proper assessment of this new method needs to include a comparison not just of the results in the selected test cases, but also of the computation power required to derive the results. This should not be extensive to address: an additional column in Tables 3-7 that provides an estimate of the average computational time required to achieve the noted results and RMSE errors, together with relevant explanations in the associated text, would help address this oversight. In addition, a similar column in Table 9 would allow comparison of the computation time for the presented results obtained with CBathy vs. UBathy.

**Reply3:** we understand the Reviewer's concern and have tried to address it. Whenever phase fitting is used, the computational time is “small”: for the cases run, phase fitting without windowing takes a few tens of seconds at most, while phase fitting with windowing takes a few tens of minutes (~20 minutes). This computational effort is of the same order of magnitude as that of cBathy. Only when function fitting is used, in order to handle wave reflection (of note, cBathy cannot handle wave reflection), does the computational cost increase, by approximately two orders of magnitude. We agree that this information is relevant to the reader, and we have added it in the new version of the manuscript.

Given that the computational times are machine dependent, we have decided to limit the information given in the new version of the manuscript to: 1) cBathy and uBathy (phase fitting) take times of the same order of magnitude, and 2) function fitting is two orders of magnitude more expensive. A more exhaustive benchmarking would have to be carried out in a dedicated study comparing the two algorithms. We hope that the Reviewer understands/agrees with our decision.

*Reviewer 2 Report*

The authors addressed the problem of reconstructing bathymetry via PCA and show both the quality and problems of the obtained results.

The paper is exhaustive, well written and of easy reading. Thus, it deserves being published with only minor revisions.

Minor issues:

line 45: Stockdon and Holman (2000) should be [11]

line 49: [12,27,e.g.] should be [e.g.: 12,27]

line 120: “if the first R”: What does “most” mean? The topic of a reduced dimensional solution in PCA is dealt with very superficially in most literature, unless specifically addressed. There are a lot of proposed estimation methods, which in my opinion do not work well; see Camiz, S., & Pillar, V. (2018). Identifying the Informational/Signal Dimension in Principal Component Analysis. *Mathematics*, *6*(11), 269, and the literature quoted therein. The worst are the currently adopted methods: fixing a total explained inertia, selecting eigenvalues larger than the average, and scree plots. Indeed, the only reasonable cutpoint would be distinguishing signal from noise: is there any way in this specific context to identify it? Can you give some contribution?

Table 1: I am curious about the use of terms like mono- and bi-chromatic: is their use common in bathymetry? Maybe the PCA results could be reported in an extra column at the end.

lines 356-7 Stockdon and Holman (2000) should be [11]

lines 407 and 414 Holman et al. (2013) should be [12].

*Author Response*

The authors addressed the problem of reconstructing bathymetry via PCA and show both the quality and problems of the obtained results. The paper is exhaustive, well written and of easy reading. Thus, it deserves being published with only minor revisions.

**Reply1:** we very much appreciate the thorough revision and comments of the Reviewer. All the modifications are highlighted in the annotated new version of the manuscript.

**Minor issues:**

line 45: Stockdon and Holman (2000) should be [11]

**Reply2:** we have modified the manuscript following the suggestion.

line 49: [12,27,e.g.] should be [e.g.: 12,27]

**Reply3:** we have modified the manuscript following the suggestion.

line 120: “if the first R”: What does “most” mean? The topic of a reduced dimensional solution in PCA is dealt with very superficially in most literature, unless specifically addressed. There are a lot of proposed estimation methods, which in my opinion do not work well; see Camiz, S., & Pillar, V. (2018). Identifying the Informational/Signal Dimension in Principal Component Analysis. Mathematics, 6(11), 269, and the literature quoted therein. The worst are the currently adopted methods: fixing a total explained inertia, selecting eigenvalues larger than the average, and scree plots. Indeed, the only reasonable cutpoint would be distinguishing signal from noise: is there any way in this specific context to identify it? Can you give some contribution?

**Reply4:** the comment on R < Q modes representing most of the variance was part of the general description of the PCA, but it is actually not relevant in this work. In fact, when dealing with wave signals, if the recovered frequency is good (sigma_w / w < 15%), the mode will be used for the bathymetric retrieval even if the variance represented by that wave field is small. Our “contribution” goes in that direction (i.e., we focus more on the “quality” of a mode than on its variance; in most cases, however, the wave field corresponds to the first mode). In any case, since we think this is not fundamental for the work, we have deleted the sentence for clarity of the text.

Table 1: I am curious about the use of terms like mono- and bi-chromatic: is their use common in bathymetry?

**Reply5:** we understand the Reviewer's confusion, because the original manuscript was not clear enough in this regard. In real conditions, the sea state might result from the superposition of waves coming from only one area or from two or more different areas, and this is why we consider the superposition of waves as a case of interest. We use the terms “monochromatic” and “bi-chromatic” to refer to situations where the signal has one or two frequencies and wavenumbers, respectively. In Section 3 the “polychromatic case” results from the combination of three wave fields (with different frequencies, wavenumber modules and directions). In Section 2, since the examples are 1D, we only play with two wave trains of different frequencies and wavenumbers. We have modified the manuscript (around lines 142 to 152) to be clearer in this regard.

Table 1: Maybe the PCA results could be reported in an extra column at the end.

**Reply6:** Table 1 in Section 2 is only meant to describe the wave conditions imposed in the three 1D examples, and it is equivalent to the old Table 2 (i.e., using the numbering of the previous version) in Section 3. However, following the Reviewer's suggestion, we have added a new table (new Table 2) that summarizes the PCA results for the 1D examples (equivalent to old Table 6 of Section 3).

lines 356-7 Stockdon and Holman (2000) should be [11]

**Reply7:** we have modified the manuscript following the suggestion.

lines 407 and 414 Holman et al. (2013) should be [12].

**Reply8:** we have modified the manuscript following the suggestion.

Author Response File: Author Response.pdf

*Reviewer 3 Report*

In the manuscript, an improvement of the method is proposed for reconstructing the bathymetry from video recordings of the water surface displacement field induced by the propagation of waves in the coastal zone using the well-known dispersion relation for linear surface gravity waves in a fluid of finite depth.

The method is based on the spectral analysis of the wave field and its decomposition on empirical orthogonal functions. The algorithm of the method is described in the paper in detail, and examples of its use for model wave fields of different dimensions in space (1D and 2D) and different compositions (monochromatic waves, their superposition, non-linear wave fields) are given. One example of using the algorithm for field data is also given.

It is shown that this rather complicated technically algorithm gives slightly better results compared to the previous version “cBathy”, however, the accuracy of determining the bathymetry still remains unsatisfactory in the case of real and complex model wave fields.

In conclusion, a recommendation is made on the use of the proposed algorithm, consisting in the fact that for a correct and more accurate result (depth values in the considered area), one must try to wait «for adequate conditions: monochromatic waves of small height (ideally), with an adequate wave period for the desired depths to be measured». But for such ideal conditions, standard spectral analysis algorithms should work well, for example, Fourier analysis (in time and space), which easily gives the dispersion curves necessary for calculating depths, allows one to see the wave directions, new harmonics generated by nonlinear interactions on the sum and difference of frequencies of incident waves, etc., and at the same time much easier to use and interpret the results.

It would be useful to compare the proposed algorithm with the Fourier method (at least when determining frequencies and wave numbers in the simplest considered model cases).

*Author Response*

In the manuscript, an improvement of the method is proposed for reconstructing the bathymetry from video recordings of the water surface displacement field induced by the propagation of waves in the coastal zone using the well-known dispersion relation for linear surface gravity waves in a fluid of finite depth.

The method is based on the spectral analysis of the wave field and its decomposition on empirical orthogonal functions. The algorithm of the method is described in the paper in detail, and examples of its use for model wave fields of different dimensions in space (1D and 2D) and different compositions (monochromatic waves, their superposition, non-linear wave fields) are given. One example of using the algorithm for field data is also given.

**Reply1:** (a remark) we would not refer to our method as a spectral analysis. We apply PCA directly to the signal (its Hilbert transform) and do not analyse any spectra, even though we obtain some important frequencies. In other words, what we apply is not a Spectral Principal Component Analysis.

It is shown that this rather complicated technically algorithm gives slightly better results compared to the previous version “cBathy”, however, the accuracy of determining the bathymetry still remains unsatisfactory in the case of real and complex model wave fields.

**Reply2:** (a remark) we do not agree with the Reviewer's statement “rather complicated technically algorithm”. On the one hand, even though the cBathy JGR paper presents the algorithm in a very simplified way, cBathy is a very complicated algorithm (see the description below, at “The cBathy algorithm”), which makes sense since the goal is not easy and some mathematical tools must be used. On the other hand, the new proposed method consists of: 1) the Hilbert transform of the matrix (Python or MATLAB have excellent routines), 2) the PCA of the matrix (similarly) and 3) fitting the angles (phases) of the first mode or modes. Compared to the cBathy algorithm, we consider that the proposed methodology is not complicated. In the manuscript, we give a short description of the cBathy algorithm but avoid the detailed description given below.
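To make the simplicity of these three steps concrete, they can be sketched in a few lines of Python. The following is a minimal, self-contained illustration on a synthetic monochromatic record in constant depth; all parameter values (depth, period, sampling) are invented for the example and are not taken from the manuscript, and this is not the authors' actual implementation:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic free-surface record eta(t, x): monochromatic wave, constant depth.
g = 9.81
h_true, T = 4.0, 8.0                      # depth [m] and period [s] (illustrative)
w = 2.0 * np.pi / T                       # angular frequency [rad/s]
k = w**2 / g                              # solve w^2 = g k tanh(k h) by iteration
for _ in range(200):
    k = w**2 / (g * np.tanh(k * h_true))

t = np.arange(0.0, 120.0, 0.5)            # 2 minutes sampled at 2 Hz
x = np.arange(0.0, 200.0, 2.0)            # cross-shore transect [m]
eta = np.cos(w * t[:, None] - k * x[None, :])

# Step 1: Hilbert transform in time -> complex analytic signal.
eta_a = hilbert(eta, axis=0)

# Step 2: PCA via SVD; the first temporal/spatial modes carry the wave.
U, s, Vh = np.linalg.svd(eta_a, full_matrices=False)

# Step 3: fit the unwrapped phases of the first mode to recover w and k,
# then invert the linear dispersion relation for the depth.
w_est = abs(np.polyfit(t, np.unwrap(np.angle(U[:, 0])), 1)[0])
k_est = abs(np.polyfit(x, np.unwrap(np.angle(Vh[0, :])), 1)[0])
h_est = np.arctanh(min(w_est**2 / (g * k_est), 0.999999)) / k_est
```

For this clean synthetic record the sketch recovers the imposed depth; in the actual method the spatial phase fit is, of course, more elaborate (2D, with optional windowing or function fitting).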

Regarding the accuracy and comparison of both inversion algorithms, our goal was to show that our algorithm can handle real cases, not that it will always improve on the cBathy results. A detailed comparison of different inversion algorithms is out of the scope of the paper. However, in this reference test case the new approach increases the solved region, compared to cBathy, from 60% to 84%, while the bias is reduced from ~50 cm to ~25 cm. In any case, we are introducing a methodology that is not in the literature, and we find that having different tools to invert the bathymetry is valuable.

In conclusion, a recommendation is made on the use of the proposed algorithm, consisting in the fact that for a correct and more accurate result (depth values in the considered area), one must try to wait «for adequate conditions: monochromatic waves of small height (ideally), with an adequate wave period for the desired depths to be measured». But for such ideal conditions, standard spectral analysis algorithms should work well, for example, Fourier analysis (in time and space), which easily gives the dispersion curves necessary for calculating depths, allows one to see the wave directions, new harmonics generated by nonlinear interactions on the sum and difference of frequencies of incident waves, etc., and at the same time much easier to use and interpret the results.

It would be useful to compare the proposed algorithm with the Fourier method (at least when determining frequencies and wave numbers in the simplest considered model cases).

**Reply3:** as far as we know, there is no reference in the literature exploring the use of standard spectral (Fourier) analysis for the depth inversion problem. The underlying reason is that this simple method does not work; for this reason more complex algorithms, like cBathy or the one presented in our manuscript, have been developed.

A standard spectral analysis would only work for a monochromatic wave field with CONSTANT wavenumber (i.e., constant water depth). Also, we note that cBathy does not work well for clean signals (see below), and this is another reason why we do not compare the uBathy results with any other method (cBathy or a “standard spectral analysis”, if it were possible) for the synthetic cases.

Naturally, there are favourable wave conditions for applying the new method (just as for in situ methods), but that does not mean that any algorithm would work in such reasonably good conditions. Further, we describe the optimal conditions (we feel compelled to do so), but the algorithm also works for other conditions. Unless some specific suggestions are given, we prefer not to modify the manuscript in this regard.

**The cBathy algorithm**

As a previous step, cBathy computes the spectrum of each pixel's temporal signal (data points) and stores the phases. Then, it tries to infer the bathymetry at a mesh of points defined by the user (inversion points). A neighborhood around each point is defined (the “tiles”: cBathy works locally, not globally as uBathy does, to find the modes). The algorithm performs a local analysis over each tile to estimate the bathymetry at each point. The definition of the tiles strongly affects the result of the inversion; however, the selection of optimum tiles is a problem that is still not resolved (at least in version 1.2 of cBathy). For each inversion point, only the pixels within its tile are considered. The spectral cross-correlation between all the tile pixels' spectra is computed and stored as a three-dimensional array (the cross-spectral matrix, CSM). If a big tile is used, or if a high spectral resolution is needed, substantial computational resources are required. The CSM is then averaged over different frequency bands, selected by the user. The selection of the bands is another set of parameters left up to the user, and it also strongly affects the inversion result.

For signals corrupted with noise, the band averaging tends to give larger amplitudes over the bands which contain the peaks of the wave signals. This happens because the spectrum is dominated by the signal of interest over the bands where its amplitude is large enough (over the peaks), and dominated by the noise elsewhere; over those bands, the spectral phases are, respectively, aligned or randomly distributed. Note that this does not happen for uncorrupted signals, where the tendency is to have aligned phases almost everywhere (the zeroes of the spectral response of the windowing function give slightly smaller amplitude averages: paradoxically, the scheme works badly for a very clean signal). To capture this general tendency (over the tile's pixel pairs), the amplitudes of the band averages are averaged over all the pixel pairs. This is the coherence function.

The bands that present the largest coherence are considered as candidates to contain wave signals, and are therefore further analyzed (separately). The analysis of each band consists in extracting the eigenvector of the CSM over the selected band associated with the largest eigenvalue. This eigenvector is assumed to contain the spatial phase of the wave over the pixels of the tile (that is, the first EOF of the CSM on the band of interest is selected, as a way to identify the main contributor to the CSM phases while removing the noise perpendicular to it). This phase evolution is only further analyzed if its eigenvalue stands out sufficiently from the others.

The tile is then assumed to be small enough for the spatial phase to be modelled as generated by a constant wavenumber. A complex exponential of constant spatial frequency (the wavenumber) is fitted to the phase of the first eigenvector. The fitting is seeded with a crude first approximation of the wavevector components, computed as the median of the phase evolution over cross-shore and alongshore transects (notice that function fitting is computationally expensive). The fitting gives a different weight to the phase associated with each pixel, depending on the amplitude of the eigenvector at that position and its distance to the centre of the tile.
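As an illustration of this tile-level step, the constant-wavenumber fit can be sketched as follows. This is not cBathy's actual code: the tile geometry, noise level and the use of scipy's generic `least_squares` solver (unweighted, unlike cBathy's weighted fit) are assumptions of this sketch:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Synthetic tile: regular pixel grid and a first-EOF-like complex field whose
# phase is generated by a hidden constant wavevector, plus small phase noise.
dx = 3.0                                            # pixel spacing [m]
x = np.arange(0.0, 60.0, dx)
y = np.arange(0.0, 60.0, dx)
X, Y = np.meshgrid(x, y)
kx_true, ky_true = 0.08, 0.02                       # rad/m (illustrative)
v = np.exp(1j * (kx_true * X + ky_true * Y))
v = v * np.exp(1j * 0.05 * rng.standard_normal(v.shape))

# Crude seed: median phase increment along the two directions, mirroring the
# transect-median first approximation described in the text.
kx0 = np.median(np.angle(v[:, 1:] * np.conj(v[:, :-1]))) / dx
ky0 = np.median(np.angle(v[1:, :] * np.conj(v[:-1, :]))) / dx

def residual(p):
    # Complex misfit between the model exponential and the observed field.
    d = np.exp(1j * (p[0] * X + p[1] * Y)) - v
    return np.concatenate([d.real.ravel(), d.imag.ravel()])

kx_est, ky_est = least_squares(residual, x0=[kx0, ky0]).x
```

The nonlinear step is cheap here because the seed is already close; in cBathy the residuals are additionally weighted by the eigenvector amplitude and by distance to the tile centre.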

The fitted wavenumber is then compared to the shallow- and deep-water wavenumber limits implied by the estimated wave frequency, and the result is only accepted (or rejected, with all the computation it required) if it falls between these two limits.

Finally, all the pairs of frequency and wavenumber that lie within the tile (that is, the pairs associated with each monochromatic wave over all the inversion points contained within the tile of the current inversion point) are fitted to the dispersion equation. The depth generating the dispersion relation that best fits the pairs is taken as the cBathy estimate of the depth at the inversion point. Again, the fitting is weighted, using in this case a combination of the skill of the wavenumber fit, the value of the wavenumber, the eigenvalue associated with the fitted eigenvector, the depth-wavenumber sensitivity associated with the wavenumber, and the distance between the point related to the wavenumber-frequency pair and the current inversion point.
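In essence, this final step amounts to choosing the depth whose dispersion curve best fits the observed frequency-wavenumber pairs. A stripped-down, unweighted sketch (with invented pairs and noise; cBathy's actual fit is weighted as described above) could be:

```python
import numpy as np

g = 9.81

def dispersion_k(w, h):
    # Wavenumber from the linear dispersion relation w^2 = g k tanh(k h),
    # solved by fixed-point iteration starting from the deep-water value.
    k = w**2 / g
    for _ in range(100):
        k = w**2 / (g * np.tanh(k * h))
    return k

# Invented (frequency, wavenumber) pairs for a hidden 5 m depth, with 1% noise.
rng = np.random.default_rng(1)
h_true = 5.0
w_obs = 2.0 * np.pi / np.array([6.0, 8.0, 10.0, 12.0])   # periods of 6-12 s
k_obs = np.array([dispersion_k(w, h_true) for w in w_obs])
k_obs = k_obs * (1.0 + 0.01 * rng.standard_normal(k_obs.size))

# Scan candidate depths and keep the best (least-squares) fit to the pairs.
depths = np.arange(0.5, 15.0, 0.02)
cost = [sum((dispersion_k(w, h) - ko)**2 for w, ko in zip(w_obs, k_obs))
        for h in depths]
h_est = depths[int(np.argmin(cost))]
```

A brute-force depth scan is used here for transparency; a weighted nonlinear solver would be the natural choice in practice.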

Author Response File: Author Response.pdf

*Reviewer 4 Report*

The article “UBathy: A New Approach for Bathymetric Inversion from Video Imagery” by Simarro et al. proposes a methodology for inferring the bathymetry from the analysis of the wave frequency and wavenumber at a given coast. The methodology is tested with monochromatic and polychromatic wave trains under different bathymetric scenarios. Finally, the method is compared with cBathy (Holman et al. 2013), using the same benchmark from Duck (NC). The proposed UBathy seems to recover the bathymetry with higher accuracy at shallower water depths than cBathy.

I have found the article an enjoyable read and within the scope of Remote Sensing. The current trend in Ocean Modelling goes towards high resolution in the coastal area. However, the bathymetry plays a primary role in such areas, not only in wave propagation, but also in nearshore circulation. Periodic updating of the bathymetry is deemed necessary, but it is not an easy (nor cheap) task. This paper addresses a current shortcoming and proposes a cost-affordable solution.

The methodology is consistent with the state-of-the-art. The sections are well balanced among themselves. The introduction states clearly the problem. The paper also provides enough detail for reproducing the method. I would really like to see this manuscript published.

My main concern with this paper would be further emphasis on how to properly use this algorithm. Although some ranges of use can be inferred, implicitly or explicitly, throughout the text, more clarity on this point would help to strengthen the impact of this contribution.

As a minor remark, I would suggest shortening some sentences. Some sentences are long or hard to understand. I provide some examples below.

Hence, my recommendation would be the acceptance of this manuscript after minor revisions. I consider that the potential impact of this contribution will be increased after addressing minor issues in the Discussion section and editing.

**General remarks**

1. I deeply appreciate that not all results shown are excellent. For instance, the authors remark the limitations of the method in the reflective case (lines 227-229).

The accuracy of the method seems to strongly depend on the accuracy of recovering the wave frequencies and wave numbers. This task is far from trivial in real cases.

My main question is: what would be the proper range of application of this methodology? I would appreciate further comments in the discussion section about these four aspects:

(i) The impact of the beach slope: All the cases presented in this manuscript have a gentle dissipative slope. How would the method perform on a reflective beach slope? Would the bathymetry be as accurate?

(ii) Wave breaking: This goes in the same line. Wave breaking with gentle slopes and low amplitudes leads to “gentler” breaking. What would happen with steep waves on steep slopes?

(iii) Wave direction: Figure 8 (bottom, left) shows the isobaths parallel to the wave crests. The wave angles in all experiments are close to perpendicular at the shore. That is normal, because at such shallow water depths, refraction and shoaling are relevant in the wave propagation. Additionally, FUNWAVE does not perform well with oblique waves. How would the method behave under waves oblique to the shoreline? For instance, how would this method behave under an oblique mature swell?

(iv) The effect of tides: The method seems to perform worse at deeper water depths. In a microtidal environment, such an effect could be considered a minor issue. However, what would happen in a macrotidal environment? When would be the best moment for inferring the bathymetry (i.e., flood or ebb tide)?

I think that these clarifications would avoid potential misuses of this method by any interested researcher.

2. Some sentences are long and hard to interpret. For instance, lines 249-251; lines 388-390. Shorter sentences would enhance readability.

3. What is the computational cost of the whole methodology? The authors use the concept “high computational cost” (line 460), but it is hard to establish a framework for comparison. What would be the order of magnitude? Seconds, hours, days? Given the relative long-time between significant bathymetric changes (be it hours in case of wave extreme events) and the model update, computational time does not seem a great concern.

**Specific remarks**

line 266: Please justify why windowing was not applied. Should we apply windowing in all cases?

Table 6: Although it is straightforward to compute, please consider including the wave amplitudes for the different factors.

Figure 3: I have reviewed this manuscript on a printed version. The dashed lines can hardly be seen. Could you increase the line width? The same comment applies to the “blue lines for the exact depth”.

*Author Response*

The article “UBathy: A New Approach for Bathymetric Inversion for Video Imagery” by Simarro

et al. proposes a methodology for inferring the bathymetry from the analysis of the wave

frequency and wave number at a given coast. The methodology is tested with monochromatic

and polychromatic wave trains under different bathymetric scenarios. Finally, the method is

compared with cBathy (Holman et al. 2013), by using the same benchmark from Duck (NC). The

proposed UBathy seems to recover the bathymetry with higher accuracy at shallower water

depths than cBathy.

I have found the article an enjoyable reading and within the scope of Remote Sensing. The

current trend in Ocean Modelling goes towards high resolution at the coastal area. However, the

bathymetry plays a primary role at such areas, not only in wave propagation, but also in

nearshore circulation. Periodic update of the bathymetry is deemed necessary, but it is not an

easy (neither cheap) task. This paper addresses a current shortcoming and proposes a

cost-affordable solution.

The methodology is consistent with the state-of-the-art. The sections are well balanced among

themselves. The introduction states clearly the problem. The paper also provides enough detail

for reproducing the method. I would really like to see this manuscript published.

My main concern with this paper would be further emphasis on how to properly use this

algorithm. Despite that some use ranges can be obtained, be it implicit or explicit throughout the

text, more clarity on this point would help to strengthen the impact of this contribution.

As a minor remark, I would suggest to shorten some sentences. Some sentences are long or

hard to understand. I would provide some examples below.

Hence, my recommendation would be the acceptance of this manuscript after minor revisions. I

consider that the potential impact of this contribution will be increased after addressing minor

issues in the Discussion section and editing.

Reply1: we very much appreciate the thorough revision and comments of the Reviewer.

All the modifications are highlighted in the annotated new version of the manuscript.

General remarks

1. I deeply appreciate that not all results shown are excellent. For instance, the authors remark

the limitations of the method in the reflective case (lines 227-229).

The accuracy of the method seems to strongly depend on the accuracy of recovering the wave

frequencies and wave numbers. This task is far from trivial at real cases.

My main question is: what would be the proper range of application of this methodology? I

would appreciate further comments on the discussion section about these four aspects:

(i) The impact of the beach slope: All the cases presented in this manuscript have a gentle

dissipative slope. How would the method perform in a reflective beach slope? Would the

bathymetry be so accurate?

Reply2: Having a reflective beach slope has several consequences for the wave propagation (our signal for depth inversion): the slope influences the wave-breaking pattern (see Reply3 below), it reduces the cross-shore extent of the domain for a given water depth and, most importantly for this contribution, it can introduce wave reflection. Significant wave reflection is a critical issue for this method: we have shown that it requires the use of “function fitting” to recover the wavenumber, which is computationally much more expensive (see Reply7). Among the more realistic 2D examples analysed in this contribution, we only observe wave reflection in the FUNWAVE simulation with F = 0.25. For this case, “function fitting” provides better results than “phase fitting” (with and without windowing), showing its usefulness in reflective conditions. In the real case (Duck beach) we do not observe wave reflection, and “phase fitting with windowing” provides the best results.

Other than the above considerations, we do not see how the slope would affect the performance of the algorithm. Following the Reviewer’s concern, we have added some of the above ideas to the Discussion in the new version of the manuscript.

(ii) Wave breaking: This goes along the same lines. Wave breaking on gentle slopes with low amplitudes leads to “gentler” breaking. What would happen with steep waves on steep slopes?

Reply3: In fact, the proposed method belongs to the family of methodologies that are, in principle, only applicable to the shoaling zone, i.e., it is not valid under wave-breaking conditions. It is true that such methods might also give reasonable estimates of the bathymetry inside the breaking zone under the “gentler” breaking occurring on dissipative profiles (here, wave breaking only occurs in the real case of Duck beach), but this is not the focus of the present contribution. Other methods, such as that of Aarninkhof et al. (2003), aim to extract the bathymetry inside the breaking area, as mentioned in the Introduction. Following the Reviewer’s advice, we have added some of the above ideas to the Discussion in the new version of the manuscript.

Aarninkhof, S.; Turner, I.; Dronkers, T.; Caljouw, M.; Nipius, L. (2003) A video-based technique for mapping intertidal beach bathymetry. Coastal Engineering, 49, 275–289. doi:10.1016/S0378-3839(03)00064-4.

(iii) Wave direction: Figure 8 (bottom left) shows the isobaths parallel to the wave crests. The wave angles in all experiments are close to shore-normal, which is expected because, at such shallow water depths, refraction and shoaling are dominant in the wave propagation. Additionally, FUNWAVE does not perform well with oblique waves. How would the method behave with waves oblique to the shoreline, for instance under an oblique mature swell?

Reply4: The very small “ripples” (errors in h below 1%) in Figure 8 are equivalent to the oscillations observed in Figure 3 (monochromatic case, phase fitting). They also appear when the wave propagates normal to the shore but are, in that case, parallel to the shore (this is the case of the result shown in Figure 3). They disappear with windowing, as seen in Figure 3. Windowing is not used in Section 3.1 because the goal of that section is to analyze the influence of the mesh (see Reply8).

(iv) The effect of tides: The method seems to perform worse at greater water depths. In a microtidal environment such an effect could be considered a minor issue; however, what would happen in a macrotidal environment? When would be the best moment to infer the bathymetry (i.e., at flood or ebb tide)?

Reply5: Indeed, the bathymetry cannot be extracted at depths beyond a certain threshold (which depends on the wave period, as explained in Section 4.1) because the waves do not feel the bottom at greater depths. In a macrotidal environment, and for a given wave period, applying the method at high tide would retrieve the bathymetry of the area closer to the dry beach (including part of the intertidal zone), while applying it at low tide would provide the bathymetry of the area further offshore. Following the comment, we have added a sentence to the new version of the manuscript.
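The depth threshold mentioned in Reply5 follows from the linear dispersion relation omega^2 = g k tanh(k h) that underlies this family of inversion methods: once a frequency-wavenumber pair (omega, k) is recovered, h follows in closed form, but only while omega^2 < g k (otherwise the wave does not feel the bottom). A minimal sketch of this inversion (our illustration with hypothetical values, not the authors' code):

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def depth_from_dispersion(omega, k):
    """Invert omega^2 = G * k * tanh(k * h) for the depth h.

    Returns None when omega^2 >= G * k: the wave is then effectively
    in deep water and the bottom leaves no signature on (omega, k).
    """
    ratio = omega ** 2 / (G * k)
    if ratio >= 1.0:
        return None  # depth threshold exceeded; bathymetry not recoverable
    return math.atanh(ratio) / k

# Example: an 8 s wave with a 60 m wavelength (hypothetical values)
omega = 2.0 * math.pi / 8.0   # angular frequency (rad/s)
k = 2.0 * math.pi / 60.0      # wavenumber (rad/m)
h = depth_from_dispersion(omega, k)  # roughly 6.6 m
```

For a fixed wavenumber, longer-period waves stay below the threshold at greater depths, which is why, in a macrotidal setting, the recoverable area shifts with the tide level.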

I think that these clarifications would prevent potential misuse of this method by interested researchers.

2. Some sentences are long and hard to interpret (for instance, lines 249-251 and lines 388-390). Shorter sentences would enhance readability.

Reply6: We have changed those two sentences and also revised the whole document.

3. What is the computational cost of the whole methodology? The authors use the phrase “high computational cost” (line 460), but it is hard to establish a framework for comparison. What is the order of magnitude: seconds, hours, days? Given the relatively long time between significant bathymetric changes (hours, in the case of extreme wave events) and the model update, computational time does not seem a great concern.

Reply7: Assuming that the bathymetry is recovered using ~10^4 pixels and ~10 minutes of video at 1 Hz, function fitting with windowing takes on the order of a few tens of minutes, while function fitting without windowing takes on the order of one day (and, as mentioned in the manuscript, there can still be convergence problems). We have included this in the new version of the manuscript.

Specific remarks

line 266: Please justify why windowing was not applied. Should windowing be applied in all cases?

Reply8: We do so to disentangle the influence of the different parameters involved in the algorithm. In this first part of the results (Section 3.1), which focuses on the influence of the mesh, we prefer not to introduce windowing (to avoid the further parameter w_t); windowing is analyzed in Section 3.2 (for a fixed mesh). We have modified the manuscript to justify this point.
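For concreteness, the time windowing discussed here (parameter w_t) amounts to tapering each pixel time series before the spectral step. A minimal sketch, assuming a Hann taper applied to a synthetic 10-minute record at 1 Hz (the manuscript's actual window and parameters may differ):

```python
import numpy as np

def hann_taper(signal):
    """Taper a time series with a Hann window to reduce spectral leakage."""
    return signal * np.hanning(len(signal))

t = np.arange(600.0)                 # ~10 minutes sampled at 1 Hz
eta = np.cos(2.0 * np.pi * t / 8.0)  # synthetic 8 s wave signal
tapered = hann_taper(eta)            # record endpoints are damped to zero
```

Tapering suppresses the spurious end effects that produce the “ripple” artifacts discussed in Reply4, at the cost of one extra parameter to choose.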

Table 6: Although it is straightforward to compute, please consider including the wave amplitudes for the different factors.

Reply9: We have modified the manuscript following the suggestion.

Figure 3: I reviewed this manuscript from a printed copy. The dashed lines can hardly be seen; could you increase the line width? The same applies to the “blue lines for the exact depth”.

Reply10: Sorry about that; following the suggestion, we have increased the line width (to match that of the continuous lines). Note that the peaks in the function-fitting solution have changed relative to the previous version (we ran the optimization again and the errors appeared at different positions).