# A Preliminary Assessment of a Newly-Defined Multispectral Hue Space for Retrieving River Depth with Optical Imagery and In Situ Calibration Data


## Abstract


## 1. Introduction

## 2. Principles of Remotely-Sensed Estimation of River Bathymetry Using Passive Optical Measurements

The upwelling radiance ${L}_{\uparrow}$ (in W·m$^{-2}$·sr$^{-1}$) measured over water in a given spectral band with central wavelength $\lambda $ can be split into four components (e.g., Legleiter et al. [8]):

- ${L}_{b}$ is the radiance originating from the Lambertian (diffuse) reflection from the bed substrate;
- ${L}_{c}$ is the volume radiance of the water column of depth h, which is basically sunlight that has been backscattered upwards before reaching the bottom;
- ${L}_{s}$ is the surface radiance due to specular reflections at the air–water interface. ${L}_{s}$ can make up a large fraction of ${L}_{\uparrow}$ for certain geometries or viewing angles (‘sun glints’);
- ${L}_{p}$ is a path radiance due to atmospheric scattering.

Here, ${E}_{\downarrow}$ is the downwelling solar irradiance (in W·m$^{-2}$), while the product ${C}_{0}{T}_{a}$ (in sr$^{-1}$) accounts for both atmospheric and air–water interface transmission (${C}_{0}$ can be considered roughly independent of wavelength). These terms are then multiplied by a reflectance term (brackets) which, according to Philpot [10], can be written as a mixture of two ‘end-members’:

- The first end-member is the reflectance of bed/substrate, ${R}_{b}$, which depends on both location $\mathbf{x}$ and wavelength $\lambda $,
- The second end-member is the reflectance ${R}_{c,\infty}$ that would be measured over an ‘infinitely deep’ water column.

- First, the reflectance ${R}_{c,\infty}$ of deep water is assumed to be small compared to the bed reflectance ${R}_{b}$ (for any substrate and any wavelength). Deep water reflectance is certainly low if the water is clear, but this assumption breaks down in the presence of sediment, dissolved organic matter, algae, etc. Furthermore, the bed might have a very low reflectance. Quartz sand might be highly reflective, but mud and rocks (especially rocks coated with a microbial film) can be quite dark;
- Second, the ratio of bed reflectances at wavelengths ${\lambda}_{i}$ and ${\lambda}_{j}$, $\frac{{R}_{b}(\mathbf{x},{\lambda}_{i})}{{R}_{b}(\mathbf{x},{\lambda}_{j})}$ is approximated as uniform in space.
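The two-end-member model described above can be made concrete with a short numerical sketch. The bracketed reflectance term is written here with a simple exponential attenuation of effective two-way coefficient $K$; the function and parameter names are illustrative assumptions, not the paper's exact notation:

```python
import numpy as np

def upwelling_radiance(h, R_b, R_c_inf, E_down, C0_Ta, K, L_s=0.0, L_p=0.0):
    """Two-end-member reflectance model (after Philpot): the effective
    reflectance is a depth-dependent mixture of the bed reflectance R_b
    and the 'infinitely deep' water reflectance R_c_inf."""
    mix = R_c_inf + (R_b - R_c_inf) * np.exp(-2.0 * K * h)  # 2K: down + up path
    return E_down * C0_Ta * mix + L_s + L_p  # plus surface and path radiances
```

With a bright bed ($R_b > R_{c,\infty}$), the radiance decays from ${E}_{\downarrow}{C}_{0}{T}_{a}{R}_{b}$ at $h=0$ toward the deep-water asymptote as depth increases, which is what ratio-based depth retrieval exploits.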

## 3. Beyond Spectral Ratios: A Definition of Multispectral Hue

#### 3.1. Rationale for the Definition of Multispectral Hue

#### 3.1.1. RGB Case: The Color Wheel

- The pure color can be partially desaturated, i.e., mixed with a white component. Such a mixing can occur for example with specular reflections, which will appear nearly white under white illumination if the refractive index of the material is slowly varying with wavelength (this is the case for water, or for an object coated with varnish).
- The overall brightness or value of the color can be decreased. Physically speaking, this other operation is similar to either reducing the intensity of the light source, or changing the orientation of an object’s surface normals so that less light is diffused towards the sensor (‘shaded face’ effect).

#### 3.1.2. Notations

#### 3.1.3. The Reason Why Multispectral Hue Should Have $(n-2)$ Degrees of Freedom with n Bands

#### 3.2. Multispectral Hue as a Directional Variable

#### 3.2.1. Mathematical Definition

#### 3.2.2. Illustration with the RGB Image

#### 3.2.3. Summary of the Section

- With $n=3$ bands, $\mathbf{U}$ is on ${\mathbb{S}}^{1}$ the unit circle in the Euclidean plane ${\mathbb{R}}^{2}$: it is closely related to the ‘color wheel’;
- With $n=4$ bands, $\mathbf{U}$ is on ${\mathbb{S}}^{2}$ the ‘usual’ unit sphere in the Euclidean space ${\mathbb{R}}^{3}$. It has two degrees of freedom, which we might call ‘color latitude’ and ‘color longitude’, as will be seen later;
- With $n>4$ bands, the hypersphere ${\mathbb{S}}^{n-2}$ can no longer be visualized directly. However, it is still a smooth Riemannian manifold, and all familiar properties of ${\mathbb{S}}^{1}$ and ${\mathbb{S}}^{2}$ (geodesic distance, Euler angles, rotation group, etc.) extend to any dimension.
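For the $n=3$ case, the mapping from an RGB triplet to a point on ${\mathbb{S}}^{1}$ can be sketched as follows. The particular orthonormal basis of the plane orthogonal to the white vector is an arbitrary (assumed) choice, since any rotation of it yields an equivalent hue coordinate:

```python
import numpy as np

def rgb_hue_angle(r, g, b):
    """Multispectral hue for n = 3 bands: a point on the unit circle S^1
    in the plane orthogonal to the white vector (basis choice assumed)."""
    c = np.array([r, g, b], dtype=float)
    cp = c - c.mean()                       # remove the 'white' component
    # Orthonormal basis of the plane orthogonal to w = (1,1,1)/sqrt(3)
    e1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)
    e2 = np.array([1.0, 1.0, -2.0]) / np.sqrt(6.0)
    x, y = cp @ e1, cp @ e2
    U = np.array([x, y]) / np.hypot(x, y)   # point on S^1
    return U, np.arctan2(y, x)              # hue as a directional variable
```

With this basis, the three pure primaries land 120° apart on the circle, as on the color wheel of Figure 2.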

#### 3.3. Mixture Models for the Analysis of Multispectral Hue Distribution

#### 3.3.1. Mixture Density

#### 3.3.2. Parameter Estimation Using Expectation–Maximization Algorithm

- E-step (Expectation): at each iteration $(r)$, we start with current parameter estimates ${\left\{{\mu}_{j}^{(r)};{\kappa}_{j}^{(r)};{\pi}_{j}^{(r)}\right\}}_{j=1\dots m}$ and define updated membership probabilities using Bayes’ formula:$${\pi}_{ij}^{(r+1)}=\frac{{\pi}_{j}^{(r)}{f}_{j}\left({\phi}_{i}\,|\,{\mu}_{j}^{(r)},{\kappa}_{j}^{(r)}\right)}{{\sum}_{{j}^{\prime}=1}^{m}{\pi}_{{j}^{\prime}}^{(r)}{f}_{{j}^{\prime}}\left({\phi}_{i}\,|\,{\mu}_{{j}^{\prime}}^{(r)},{\kappa}_{{j}^{\prime}}^{(r)}\right)}$$
- M-step (Maximization): given these membership probabilities, the parameters of each component are re-estimated using a weighted maximum likelihood method based on the maximization of:$${\ell}_{j}\left({\mu}_{j},{\kappa}_{j}\right)=\sum _{i=1}^{{N}_{\mathrm{pix}}}{\pi}_{ij}^{(r+1)}\log {f}_{j}\left({\phi}_{i}\,|\,{\mu}_{j},{\kappa}_{j}\right)$$while the mixture weights are updated as$${\pi}_{j}^{(r+1)}=\frac{1}{{N}_{\mathrm{pix}}}\sum _{i=1}^{{N}_{\mathrm{pix}}}{\pi}_{ij}^{(r+1)}$$
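One iteration of this algorithm on ${\mathbb{S}}^{1}$ (the von Mises case, as in Figure 6) can be sketched as follows. The concentration update uses the Banerjee et al. approximation of the inverse of $A(\kappa)={I}_{1}(\kappa)/{I}_{0}(\kappa)$, an assumed implementation detail since the paper does not specify its estimator for $\kappa$:

```python
import numpy as np

def vm_pdf(phi, mu, kappa):
    # Von Mises density on the circle (np.i0 is the modified Bessel I0)
    return np.exp(kappa * np.cos(phi - mu)) / (2 * np.pi * np.i0(kappa))

def em_step(phi, mus, kappas, weights):
    """One EM iteration for a mixture of von Mises distributions."""
    # E-step: posterior membership probabilities (Bayes' formula)
    dens = np.stack([w * vm_pdf(phi, m, k)
                     for w, m, k in zip(weights, mus, kappas)])   # (m, N)
    post = dens / dens.sum(axis=0)                                # pi_ij
    # M-step: weighted mean direction and concentration per component
    new_mus, new_kappas = [], []
    for j in range(len(mus)):
        C = np.sum(post[j] * np.cos(phi))
        S = np.sum(post[j] * np.sin(phi))
        new_mus.append(np.arctan2(S, C))
        Rbar = min(np.hypot(C, S) / post[j].sum(), 0.99)  # overflow guard
        # Banerjee et al. approximation of the inverse of A(kappa)
        new_kappas.append(Rbar * (2 - Rbar**2) / (1 - Rbar**2))
    new_weights = post.sum(axis=1) / len(phi)
    return np.array(new_mus), np.array(new_kappas), new_weights, post
```

Iterating `em_step` to convergence recovers both the component parameters and the posterior memberships ${\pi}_{ij}$ shown in the right panel of Figure 6.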

## 4. Using Multispectral Hue as a Predictor for Depth

#### 4.1. Dataset

#### 4.1.1. High-Resolution Airborne Imagery

The imagery used in this study comes from the BDORTHO® database provided by the IGN (French National Geographic Institute). This nationwide dataset consists of RGB and Infrared Color (IRC) ortho-rectified and radiometrically corrected airborne images [24,25] with a resolution of 0.50 m. For our area of interest, we use images acquired during the period 17–30 July 2016. This dataset provides us with four coregistered spectral bands, namely near-infrared (NIR), red, green, and blue.

#### 4.1.2. In Situ Depth Measurements

#### 4.1.3. LiDAR Data in the Floodplain

Elevation data in the floodplain come from a LiDAR-derived database also provided by IGN. Even though the raster tiles are without gaps, elevation data in the river channel must be ignored: the value at a pixel located in the channel is simply the result of an interpolation between nearby bank pixels. In order to filter out interpolated pixels, the elevation rasters come with useful ancillary DST (dist) rasters, which indicate the distance of each pixel to the nearest valid LiDAR point (see Figure 8, right panel).

#### 4.2. Hue Sphere ${\mathbb{S}}^{2}$ for $n=4$ Bands

#### 4.3. Masks

- Riparian vegetation as well as floating algae are masked using the classical Normalized Difference Vegetation Index (NDVI) [26], which indicates high chlorophyll content. In practice, the rather low threshold selected (NDVI > −0.3) excludes more than just vegetation pixels: all surfaces with any non-negligible reflectance in the IR will be masked;
- Very dark areas are thoroughly removed: they mainly consist of water pixels in the shadow of riparian vegetation; as these pixels are difficult to characterize spectrally (light has traveled through both leaves and water), the criterion used is simply a threshold on the mean of the four bands;
- A filter is designed to remove whitewater/white wakes in riffle areas: as these pixels appear in light gray in the visible bands (almost hueless in RGB) but with substantial absorption in the IR (typical digital counts are [0.2 0.6 0.6 0.6]), they are removed on the basis of their low saturation and high brightness in the RGB space along with a threshold in the NIR (${C}_{\mathrm{NIR}}<0.3$). Note that they have nothing to do with specular reflections: the surface of water appears white there for any viewing angle, because of the scattering from the bubbles/air pockets, not from surface reflection;
- A mask for man-made features crossing the river, such as bridges or power lines, as well as their cast shadows, is built manually (some of these pixels are already masked by the previous filters, but not all).
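The four masks can be sketched as a single function. The NDVI, white-water, and NIR thresholds follow the text; the darkness and saturation cutoffs are illustrative assumptions:

```python
import numpy as np

def river_pixel_mask(nir, r, g, b, manual_mask=None):
    """Keep only river pixels; digital counts assumed scaled to [0, 1]."""
    ndvi = (nir - r) / (nir + r + 1e-9)
    vegetation = ndvi > -0.3                      # chlorophyll / IR reflectors
    brightness = (nir + r + g + b) / 4.0
    dark = brightness < 0.05                      # shadowed water (assumed cutoff)
    rgb = np.stack([r, g, b])
    value = rgb.max(axis=0)
    sat = (value - rgb.min(axis=0)) / (value + 1e-9)
    whitewater = (sat < 0.15) & (value > 0.5) & (nir < 0.3)
    keep = ~(vegetation | dark | whitewater)
    if manual_mask is not None:                   # bridges, power lines, ...
        keep &= ~manual_mask
    return keep
```

For example, a typical white-water pixel with counts [0.2, 0.6, 0.6, 0.6] is caught by the low-saturation/high-brightness/low-NIR rule, while a clear-water pixel passes all filters.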

#### 4.4. Hue–Depth Visual Correlation on ${\mathbb{S}}^{2}$

#### 4.5. Mixture Models on ${\mathbb{S}}^{2}$ and Higher Dimension, and Depth Predictor

- We assume that the probability density function of multispectral hue in river pixels can be modeled by a two-component mixture density: the first component will represent the statistical distribution of substrate hue, while the second component will represent the statistical distribution of ‘deep water’ hue;
- We will estimate the parameters of the two components so that the membership probability to ‘deep’ component correlates with depth: the probability ${\pi}_{i,\mathrm{deep}}$ that pixel i with hue ${\mathbf{U}}_{\mathbf{i}}$ comes from the ‘deep’ component will be the predictor for depth ${h}_{i}$ at pixel i.

#### 4.6. Modified EM Algorithm for Depth Estimation

- E-step (Expectation): given current parameter estimates $\left\{{\pi}_{\mathrm{deep}}^{(r)};{\mathbf{\Theta}}_{\mathbf{deep}}^{(r)};{\mathbf{\Theta}}_{\mathbf{bed}}^{(r)}\right\}$ of the mixture at iteration $(r)$, compute the updated probability that pixel i with multispectral hue ${U}_{i}$ comes from a deep component:$${\pi}_{i,\mathrm{deep}}^{(r+1)}=\frac{{\pi}_{\mathrm{deep}}^{(r)}{f}_{\mathrm{deep}}\phantom{\rule{-0.166667em}{0ex}}\left({\mathbf{U}}_{\mathbf{i}}\phantom{\rule{0.166667em}{0ex}}|\phantom{\rule{0.166667em}{0ex}}{\mathbf{\Theta}}_{\mathbf{deep}}^{(r)}\right)}{{\pi}_{\mathrm{deep}}^{(r)}{f}_{\mathrm{deep}}\phantom{\rule{-0.166667em}{0ex}}\left({\mathbf{U}}_{\mathbf{i}}\phantom{\rule{0.166667em}{0ex}}|\phantom{\rule{0.166667em}{0ex}}{\mathbf{\Theta}}_{\mathbf{deep}}^{(r)}\right)+\left(1-{\pi}_{\mathrm{deep}}^{(r)}\right){f}_{\mathrm{bed}}\phantom{\rule{-0.166667em}{0ex}}\left({\mathbf{U}}_{\mathbf{i}}\phantom{\rule{0.166667em}{0ex}}|\phantom{\rule{0.166667em}{0ex}}{\mathbf{\Theta}}_{\mathbf{bed}}^{(r)}\right)}$$
- R-step (Regression): regress deep membership probability as a power law function of measured depth so that$${\widehat{\pi}}_{i,\mathrm{deep}}^{(r+1)}={a}^{(r+1)}{\left({h}_{i}\right)}^{{b}^{(r+1)}}$$
- M-step (Maximization): independently maximize the log-likelihood of each component$$\left\{\begin{array}{cc}{\displaystyle {\ell}_{\mathrm{deep}}\left({\mathbf{\Theta}}_{\mathbf{deep}}\right)=\sum _{i=1}^{{N}_{\mathrm{depth}}}{\widehat{\pi}}_{i,\mathrm{deep}}^{(r+1)}\log {f}_{\mathrm{deep}}\left({\mathbf{U}}_{\mathbf{i}}\,|\,{\mathbf{\Theta}}_{\mathbf{deep}}\right)}\hfill & ⟹\text{updated }{\mathbf{\Theta}}_{\mathbf{deep}}^{(r+1)}\hfill \\ & \\ {\displaystyle {\ell}_{\mathrm{bed}}\left({\mathbf{\Theta}}_{\mathbf{bed}}\right)=\sum _{i=1}^{{N}_{\mathrm{depth}}}\left(1-{\widehat{\pi}}_{i,\mathrm{deep}}^{(r+1)}\right)\log {f}_{\mathrm{bed}}\left({\mathbf{U}}_{\mathbf{i}}\,|\,{\mathbf{\Theta}}_{\mathbf{bed}}\right)}\hfill & ⟹\text{updated }{\mathbf{\Theta}}_{\mathbf{bed}}^{(r+1)}\hfill \end{array}\right.$$and update the mixture weight:$${\pi}_{\mathrm{deep}}^{(r+1)}=\frac{1}{{N}_{\mathrm{depth}}}\sum _{i=1}^{{N}_{\mathrm{depth}}}{\widehat{\pi}}_{i,\mathrm{deep}}^{(r+1)}$$
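The R-step, and the inverse relation used in predictive mode, can be sketched as an ordinary least-squares fit in log-log space (one possible implementation; the paper does not state its fitting procedure):

```python
import numpy as np

def fit_power_law(h, pi_deep, eps=1e-6):
    """R-step sketch: regress deep-membership probability on measured
    depth as pi = a * h**b, via least squares in log-log space."""
    logh = np.log(np.clip(h, eps, None))
    logp = np.log(np.clip(pi_deep, eps, 1.0))
    b, loga = np.polyfit(logh, logp, 1)  # slope = b, intercept = log(a)
    return np.exp(loga), b

def predict_depth(pi_deep, a, b, eps=1e-6):
    # Inverse relation used in predictive mode: h = (pi / a) ** (1 / b)
    return (np.clip(pi_deep, eps, 1.0) / a) ** (1.0 / b)
```

In predictive mode (right panel of Figure 13), `predict_depth` maps the posterior membership probability ${\pi}_{i,\mathrm{deep}}$ of any river pixel to an estimated depth ${h}_{i}$.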

## 5. Results

#### 5.1. Error Statistics

#### 5.2. Focus on Some Cross-Sections

## 6. Discussion

#### 6.1. Sources of Error

#### 6.1.1. Radiometric Inhomogeneity

The BDORTHO® images used in this study form a nationwide dataset covering all 101 French departments. In practice, these images are radiometrically equalized in order to provide a seamless mosaic at the scale of each department [24,25]. Some radiometric variations nevertheless remain within the mosaic, especially at the border between two departments. As the investigated reach spans two departments, namely Haute-Garonne (31) and Tarn-et-Garonne (82), the dataset is clearly not radiometrically homogeneous, and this increases the variability in both substrate and deep water hue. This additional variability reduces the precision of the hue–depth relation; however, since the method is designed to handle dispersion in the distribution of both components through the parameter sets ${\mathbf{\Theta}}_{\mathbf{bed}}$ and ${\mathbf{\Theta}}_{\mathbf{deep}}$ of the Fisher–Bingham–Kent distributions, the radiometric inhomogeneity can be treated statistically together with the natural variations. The key parameter quantifying this non-uniformity is the concentration parameter ${\kappa}_{\mathrm{bed}}$ of the bed hue distribution. This is an important advantage of the method over log-ratio methods, which rely on the assumption that reflectance ratios $\frac{{R}_{b}(\mathbf{x},{\lambda}_{i})}{{R}_{b}(\mathbf{x},{\lambda}_{j})}$ are roughly uniform in space (see Section 2).

#### 6.1.2. Age and Planar Precision of In Situ Data

#### 6.1.3. Water Surface Elevation

- The first estimate was simply given by the lowest valid elevation at each cross-section in the LiDAR DEM of the floodplain (which also covers the channel through interpolation between the banks). Indeed, the LiDAR survey was conducted during the 2012 low flow period, in hydraulic conditions that we assumed roughly similar to those of the 2016 images;
- The second estimate is the elevation that yields the same width as the one observed in the 2016 images (as defined by the NDWI mask), for each surveyed profile. We then interpolate linearly between the surveyed cross-sections.
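The second estimate can be sketched as a bisection on the water surface elevation: the elevation on the surveyed profile is raised until its wetted width matches the width observed on the images (function and variable names are assumptions):

```python
import numpy as np

def wse_matching_width(y, z, target_width, tol=1e-3):
    """Find, by bisection, the water surface elevation whose wetted width
    on the surveyed profile (y, z) matches the observed width.
    y: uniformly spaced transverse coordinates; z: bed elevations."""
    def wetted_width(wse):
        return np.sum(z < wse) * (y[1] - y[0])  # uniform spacing assumed
    lo, hi = z.min(), z.max()
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if wetted_width(mid) < target_width:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Bisection is valid here because wetted width is a non-decreasing function of water surface elevation on a given profile; the resulting elevations are then interpolated linearly between surveyed cross-sections.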

#### 6.2. Perspectives

## Author Contributions

## Funding

## Data Availability Statement

The BDORTHO® dataset used in this study is publicly available from the IGN open-access portal (accessed on 14 December 2020): https://geoservices.ign.fr/documentation/diffusion/telechargement-donnees-libres.html.

## Acknowledgments

## Conflicts of Interest

## Appendix A. General Expression of the Rotation Matrix Needed for Computing the Multispectral Hue in Any Dimension

## Appendix B. Computational Example of the Mapping from ${\mathbb{R}}^{\mathbf{4}}$ to ${\mathbb{S}}^{\mathbf{2}}$ with Four Bands

- We first subtract the mean of all bands from each band in order to obtain the vector ${\mathbf{c}}^{\prime}$, and then we normalize it:$${\mathbf{c}}_{\mathrm{Green}}^{\prime}=\left[\begin{array}{c}-1/4\\ -1/4\\ +3/4\\ -1/4\end{array}\right]\phantom{\rule{1.em}{0ex}}⟹\phantom{\rule{1.em}{0ex}}\frac{{\mathbf{c}}_{\mathrm{Green}}^{\prime}}{‖{\mathbf{c}}_{\mathrm{Green}}^{\prime}‖}=\frac{2}{\sqrt{3}}\left[\begin{array}{c}-1/4\\ -1/4\\ +3/4\\ -1/4\end{array}\right]=\frac{1}{2\sqrt{3}}\left[\begin{array}{c}-1\\ -1\\ +3\\ -1\end{array}\right]$$
- The latter vector is already on a two-sphere, but this sphere is embedded in a three-dimensional subspace of ${\mathbb{R}}^{4}$ whose axes do not coincide with a triplet of axes of the initial basis (this three-dimensional subspace is orthogonal to the unit ‘white’ vector $\mathbf{w}=\frac{1}{2}{[1,1,1,1]}^{T}$). In order to drop one Cartesian coordinate (e.g., the last one), we rotate $\mathbf{w}$ to the ‘North Pole’ $\mathbf{n}={[0,0,0,1]}^{T}$ of ${\mathbb{R}}^{4}$, with the 4D rotation matrix $\mathbf{M}$ such that:$$\mathbf{M}\phantom{\rule{4pt}{0ex}}\mathbf{w}=\mathbf{M}\left[\begin{array}{c}1/2\\ 1/2\\ 1/2\\ 1/2\end{array}\right]=\left[\begin{array}{c}0\\ 0\\ 0\\ 1\end{array}\right]=\mathbf{n}$$Performing this final rotation and discarding the last (zero) coordinate yields:$${\mathbf{U}}_{\mathrm{Green}}=\mathbf{M}\phantom{\rule{0.166667em}{0ex}}\frac{{\mathbf{c}}_{\mathrm{Green}}^{\prime}}{‖{\mathbf{c}}_{\mathrm{Green}}^{\prime}‖}=\frac{1}{12\sqrt{3}}\left[\begin{array}{c}\hfill -4\\ \hfill -4\\ \hfill 20\\ \hfill 0\end{array}\right]\sim \frac{1}{3\sqrt{3}}\left[\begin{array}{c}\hfill -1\\ \hfill -1\\ \hfill 5\end{array}\right]\in {\mathbb{R}}^{3}$$
- Here we express $\mathbf{U}$ in Cartesian coordinates of ${\mathbb{R}}^{n-1}$, but we can check that it indeed lies on ${\mathbb{S}}^{n-2}$:$$‖{\mathbf{U}}_{\mathrm{Green}}‖=\frac{1}{3\sqrt{3}}\sqrt{{(-1)}^{2}+{(-1)}^{2}+{5}^{2}}=\frac{1}{3\sqrt{3}}\sqrt{27}=1$$
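The computation above can be reproduced programmatically. Here the specific rotation $\mathbf{M}$ of Appendix A is replaced by a Householder reflection sending the white vector to the north pole; this is an assumed substitute that also zeroes the last coordinate and preserves the unit norm, but yields different sphere coordinates, since the hue coordinates depend on the chosen orthogonal map:

```python
import numpy as np

def multispectral_hue(c):
    """Map an n-band pixel vector to a point on S^(n-2).
    Steps: remove the mean, normalize, send the white direction to the
    north pole (Householder reflection, assumed in place of the paper's
    rotation M), and drop the last (zero) coordinate."""
    c = np.asarray(c, dtype=float)
    n = c.size
    cp = c - c.mean()                       # remove the 'white' component
    cp /= np.linalg.norm(cp)                # unit vector, still in R^n
    w = np.ones(n) / np.sqrt(n)             # unit white vector
    pole = np.zeros(n); pole[-1] = 1.0      # 'north pole' of R^n
    v = w - pole
    H = np.eye(n) - 2.0 * np.outer(v, v) / (v @ v)  # Householder: H w = pole
    return (H @ cp)[:-1]                    # last coordinate is ~0
```

Since the map is linear after centering, complementary pixel vectors (e.g., pure green vs. anti-green) land at antipodal points of the hue sphere, consistent with the water/vegetation antipodes of Figure 9.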

## Appendix C. Overview of the Fisher–Bingham–Kent (FBK) Distribution

- Degrees of freedom corresponding to the minimum number of rotations (aside from degenerate cases) needed for the change of basis from the canonical basis of ${\mathbb{R}}^{p}$ to the orthonormal basis defined by the $\left\{{\mathbf{v}}_{\mathbf{j}}\right\}$; there are as many such rotations as independent rotation planes in ${\mathbb{R}}^{p}$, that is, the number of combinations of two orthogonal directions chosen from among p:$$\left(\begin{array}{c}p\\ 2\end{array}\right)={C}_{p}^{2}=\frac{p!}{(p-2)!\phantom{\rule{0.166667em}{0ex}}2!}=\frac{1}{2}p(p-1)$$
- p degrees of freedom corresponding to the concentration parameter $\kappa $ and the $(p-1)$ asymmetry parameters ${\beta}_{j}$, $2\le j\le p$,
- 1 degree of freedom removed due to the condition on the ${\beta}_{j}$$$\sum _{j=2}^{p}{\beta}_{j}=0$$
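This count can be checked programmatically against Table A1 (a direct transcription of the three contributions above):

```python
def fbk_dof(p):
    """Degrees of freedom of the Fisher–Bingham–Kent distribution on the
    sphere embedded in R^p, counted as in the text."""
    rotations = p * (p - 1) // 2   # C(p, 2) independent rotation planes
    shape = p                      # kappa plus the (p - 1) beta_j
    constraint = 1                 # sum of the beta_j is zero
    return rotations + shape - constraint  # = (p - 1) * (p + 2) / 2
```

For $p=3$ this gives the 5 parameters of the classical “FB5” Kent distribution.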

**Table A1.**Number N of degrees of freedom for the Fisher–Bingham–Kent distribution as a function of the dimension p of the initial space.

n (Number of Bands) | $p=n-1$ | Euclidean Space | Hypersphere | $N=\frac{(p-1)(p+2)}{2}$ | Remark
---|---|---|---|---|---
3 | 2 | ${\mathbb{R}}^{2}$ | ${\mathbb{S}}^{1}$ (circle) | 2 | Von Mises distribution
4 | 3 | ${\mathbb{R}}^{3}$ | ${\mathbb{S}}^{2}$ (sphere) | 5 | Classical Fisher–Bingham “FB5”
5 | 4 | ${\mathbb{R}}^{4}$ | ${\mathbb{S}}^{3}$ | 9 | 
10 | 9 | ${\mathbb{R}}^{9}$ | ${\mathbb{S}}^{8}$ | 44 | 
50 | 49 | ${\mathbb{R}}^{49}$ | ${\mathbb{S}}^{48}$ | 1224 | 
100 | 99 | ${\mathbb{R}}^{99}$ | ${\mathbb{S}}^{98}$ | 4949 | 

## References

1. Wyrick, J.; Pasternack, G. Geospatial organization of fluvial landforms in a gravel–cobble river: Beyond the riffle–pool couplet. *Geomorphology* **2014**, 213, 48–65.
2. Mahdade, M.; Le Moine, N.; Moussa, R.; Navratil, O.; Ribstein, P. Automatic identification of alternating morphological units in river channels using wavelet analysis and ridge extraction. *Hydrol. Earth Syst. Sci.* **2020**, 24, 3513–3537.
3. Kinzel, P.J.; Wright, C.W.; Nelson, J.M.; Burman, A.R. Evaluation of an Experimental LiDAR for Surveying a Shallow, Braided, Sand-Bedded River. *J. Hydraul. Eng.* **2007**, 133, 838–842.
4. Hilldale, R.C.; Raff, D. Assessing the ability of airborne LiDAR to map river bathymetry. *Earth Surf. Process. Landforms* **2008**, 33, 773–783.
5. Nakao, L.; Krueger, C.; Bleninger, T. Benchmarking for using an acoustic Doppler current profiler for bathymetric survey. *Environ. Monit. Assess.* **2021**, 193, 356.
6. Lyzenga, D.R. Passive remote sensing techniques for mapping water depth and bottom features. *Appl. Opt.* **1978**, 17, 379–383.
7. Marcus, W.A.; Fonstad, M.A. Optical remote mapping of rivers at sub-meter resolutions and watershed extents. *Earth Surf. Process. Landforms* **2008**, 33, 4–24.
8. Legleiter, C.J.; Roberts, D.A.; Lawrence, R.L. Spectrally based remote sensing of river bathymetry. *Earth Surf. Process. Landforms* **2009**, 34, 1039–1059.
9. Campbell, J.B. *Introduction to Remote Sensing*, 2nd ed.; The Guilford Press: New York, NY, USA, 1996.
10. Philpot, W.D. Bathymetric mapping with passive multispectral imagery. *Appl. Opt.* **1989**, 28, 1569–1578.
11. Shah, A.; Deshmukh, B.; Sinha, L. A review of approaches for water depth estimation with multispectral data. *World Water Policy* **2020**, 6, 152–167.
12. König, M.; Oppelt, N. A linear model to derive melt pond depth from hyperspectral data. *Cryosph. Discuss.* **2019**, 1–17.
13. König, M.; Birnbaum, G.; Oppelt, N. Mapping the Bathymetry of Melt Ponds on Arctic Sea Ice Using Hyperspectral Imagery. *Remote Sens.* **2020**, 12, 2623.
14. Doxani, G.; Papadopoulou, M.; Lafazani, P.; Pikridas, C.; Tsakiri-Strati, M. Shallow-water bathymetry over variable bottom types using multispectral Worldview-2 image. *Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.* **2012**, 39, 159–164.
15. Niroumand-Jadidi, M.; Vitti, A.; Lyzenga, D.R. Multiple Optimal Depth Predictors Analysis (MODPA) for river bathymetry: Findings from spectroradiometry, simulations, and satellite imagery. *Remote Sens. Environ.* **2018**, 218, 132–147.
16. Montoliu, R.; Pla, F.; Klaren, A.C. Illumination Intensity, Object Geometry and Highlights Invariance in Multispectral Imaging. In *Proceedings of the Second Iberian Conference on Pattern Recognition and Image Analysis (IbPRIA’05), Part I*; Springer: Berlin/Heidelberg, Germany, 2005; pp. 36–43.
17. Kay, S.; Hedley, J.D.; Lavender, S. Sun Glint Correction of High and Low Spatial Resolution Images of Aquatic Scenes: A Review of Methods for Visible and Near-Infrared Wavelengths. *Remote Sens.* **2009**, 1, 697–730.
18. Dempster, A.P.; Laird, N.M.; Rubin, D.B. Maximum Likelihood from Incomplete Data via the EM Algorithm. *J. R. Stat. Soc. Ser. B Methodol.* **1977**, 39, 1–38.
19. Jantzi, H.; Carozza, J.M.; Probst, J.L. Les formes d’érosion en lit mineur rocheux: Typologie, distribution spatiale et implications sur la dynamique du lit. Exemple à partir des seuils rocheux molassiques de la moyenne Garonne toulousaine (Sud-Ouest, France). *Géomorphologie* **2020**, 26, 79–96.
20. Garambois, P.A.; Biancamaria, S.; Monnier, J.; Roux, H.; Dartus, D. Variational data assimilation of AirSWOT and SWOT data into the 2D shallow water model Dassflow: Method and test case on the Garonne river (France). In *Proceedings of 20 Years of Progress in Radar Altimetry*, Venice, Italy, 24–29 September 2012.
21. Oubanas, H.; Gejadze, I.; Malaterre, P.O.; Mercier, F. River discharge estimation from synthetic SWOT-type observations using variational data assimilation and the full Saint-Venant hydraulic model. *J. Hydrol.* **2018**, 559, 638–647.
22. Goutal, N.; Goeury, C.; Ata, R.; Ricci, S.; Mocayd, N.E.; Rochoux, M.; Oubanas, H.; Gejadze, I.; Malaterre, P.O. Uncertainty Quantification for River Flow Simulation Applied to a Real Test Case: The Garonne Valley. In *Advances in Hydroinformatics*; Gourbesville, P., Cunge, J., Caignaert, G., Eds.; Springer: Singapore, 2018; pp. 169–187.
23. Legleiter, C.; Overstreet, B. Hyperspectral image data and field measurements used for bathymetric mapping of the Snake River in Grand Teton National Park, WY. U.S. Geological Survey Data Release, 2018 (accessed on 2 November 2021).
24. Institut Géographique National. *BD ORTHO® Version 2.0 / ORTHO HR® Version 1.0: Descriptif de Contenu*; May 2013, updated July 2018. Available online: https://geoservices.ign.fr/ressources_documentaires/Espace_documentaire/ORTHO_IMAGES/BDORTHO_ORTHOHR/DC_BDORTHO_2-0_ORTHOHR_1-0.pdf (accessed on 14 December 2020).
25. Chandelier, L.; Martinoty, G. Radiometric aerial triangulation for the equalization of digital aerial images and orthoimages. *Photogramm. Eng. Remote Sens.* **2009**, 75, 193–200.
26. Kriegler, F.; Malila, W.; Nalepka, R.; Richardson, W. Preprocessing transformations and their effects on multispectral recognition. *Remote Sens. Environ.* **1969**, 6, 97.
27. Kent, J.T. The Fisher–Bingham Distribution on the Sphere. *J. R. Stat. Soc. Ser. B Methodol.* **1982**, 44, 71–80.
28. Kume, A.; Wood, A.T.A. Saddlepoint Approximations for the Bingham and Fisher–Bingham Normalising Constants. *Biometrika* **2005**, 92, 465–476.
29. Kume, A.; Preston, S.P.; Wood, A.T.A. Saddlepoint approximations for the normalizing constant of Fisher–Bingham distributions on products of spheres and Stiefel manifolds. *Biometrika* **2013**, 100, 971–984.
30. Amaral, G.J.A.; Dryden, I.L.; Wood, A.T.A. Pivotal Bootstrap Methods for k-Sample Problems in Directional Statistics and Shape Analysis. *J. Am. Stat. Assoc.* **2007**, 102, 695–707.

**Figure 1.**Decomposition of total measured radiance over water (adapted from Campbell, 1996 [9]). ${E}_{\downarrow}$ is the downwelling solar irradiance (W·m$^{-2}$); ${L}_{b}$, ${L}_{c}$, ${L}_{s}$ and ${L}_{p}$ are the bed, water column, surface and path radiances, respectively (W·m$^{-2}$·sr$^{-1}$).

**Figure 2.**Screenshot of the color wheel tool in QGIS. The hue or ‘pure color’ is selected along the circle with a single angular degree of freedom (DoF). The two other operations that can be performed are (1) desaturation or (2) decrease in brightness/value.

**Figure 3.**Katell’s wooden toys under natural sunlight. Note the specular reflection on the front edge of the orange cube, for example, which appears almost white.

**Figure 4.**Spectral invariant proposed by Montoliu et al. [16], computed on the image of Figure 3. (**Left panel**): location of transformed pixel values in the RGB unit cube; the invariant ${\mathbf{L}}^{(M)}$ is constrained to a plane orthogonal to the white vector $\mathbf{w}={[1,1,1]}^{T}$. (**Right panel**): transformed RGB image. Note how light reflected from the yellow oval toy produces a yellow hue on the white floor and also alters the hue on the rightmost side of the purple pentagon toy.

**Figure 5.**Illustration of multispectral hue $\mathbf{U}$ computed on the RGB image of Figure 3. (**Left panel**): location of transformed pixel values on a great circle in a plane orthogonal to the white vector $\mathbf{w}={[1,1,1]}^{T}$. (**Right panel**): transformed RGB image.

**Figure 6.**(**Left panel**): hue distribution on the ‘wooden toys’ image (the density is the black dashed line), and its modeling with a mixture of 5 Von Mises distributions. We can check that ${\pi}_{\mathrm{orange}}+{\pi}_{\mathrm{yellow}}+{\pi}_{\mathrm{green}}+{\pi}_{\mathrm{blue}}+{\pi}_{\mathrm{purple}}=1$. (**Right panel**): posterior membership probability ${\pi}_{ij}$ (given by Bayes’ formula) that pixel i comes from component j.

**Figure 7.**Location of the reach of the Garonne river investigated in this study. (**a**) Location of the reach in the whole Garonne catchment, between Toulouse and Verdun-sur-Garonne; (**b**) in situ cross-section data used for calibration.

**Figure 8.**Sample of the IGN BDORTHO® database covering the floodplain. (**a**) Elevation raster at 1 m planar resolution; (**b**) DST raster indicating the distance to the nearest valid LiDAR point for each pixel.

**Figure 9.**Representation of multispectral hue on the sphere ${\mathbb{S}}^{2}$ for two terrain patches: a patch of water pixels (A) and a patch of vegetation pixels (B). Due to the low absorption in the visible bands and high absorption in the IR for water, and the opposite behavior for chlorophyll, the two patches map at the antipode of each other (close to the anti-NIR hue for water, and close to pure NIR hue for vegetation). The image displayed on the right is a NIR-R-G composite.

**Figure 10.**Illustration of the four masks applied to multispectral images in order to retain only river pixels. The image displayed is a NIR-R-G composite.

**Figure 11.**Representation of multispectral hue on ${\mathbb{S}}^{2}$ for the pixels located at depth measurement points. As depth increases, hue tends to concentrate near a single pole corresponding to the hue of a very deep water column, denoted ${\mathbf{U}}_{\mathbf{c},\infty}$.

**Figure 12.**The probability density function $f(\mathbf{U})$ of a hue distribution on the sphere (**left**) can be modeled with a mixture of two or more Fisher–Bingham–Kent distributions (**right**). For the distribution of hue over river pixels, we used two components denoted ${f}_{\mathrm{bed}}$ and ${f}_{\mathrm{deep}}$ with respective parameter sets ${\mathbf{\Theta}}_{\mathbf{bed}}$ and ${\mathbf{\Theta}}_{\mathbf{deep}}$; ${\mathbf{v}}_{\mathbf{1},\mathbf{bed}}$ and ${\mathbf{v}}_{\mathbf{1},\mathbf{deep}}$ denote the mean direction of each component.

**Figure 13.**(**Left panel**): ‘depth’ vs. ‘deep membership probability’ regression as output by the modified expectation–maximization algorithm. This relation is of course meant to be used in the reverse direction in predictive mode (**right panel**): given the posterior membership probability ${\pi}_{i,\mathrm{deep}}$ computed at pixel i with multispectral hue ${\mathbf{U}}_{\mathbf{i}}$, we can estimate depth ${h}_{i}$ at this pixel.

**Figure 14.**Correlation between measured depth and the log of the six different spectral ratios that can be defined with four-band imagery.

**Figure 15.**Result of a Multiple Linear (ML) regression against all three independent log-ratios. Negative predicted values have been set to zero before the computation of RMSE and ${r}^{2}$.

**Figure 16.**Comparison between measured and simulated depth along two surveyed cross-sections. The method is able to capture high-frequency variations in water depth: a typical example is the crescent-shaped feature visible at the bottom of the image, which is intersected by profile BLA06.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Le Moine, N.; Mahdade, M.
A Preliminary Assessment of a Newly-Defined Multispectral Hue Space for Retrieving River Depth with Optical Imagery and In Situ Calibration Data. *Remote Sens.* **2021**, *13*, 4435.
https://doi.org/10.3390/rs13214435
