The Use of Ultra High Resolution UAV Lidar Infrared Intensity for Enhancing Coastal Cover Classification

Abstract: Coastal areas gather increasing hazard, exposure and vulnerability in the context of anthropogenic changes. Understanding their spatial responses to acute and chronic drivers requires an ultra-high spatial resolution that can only be achieved by UAV-based sensors. UAV lasergrammetry constitutes, to date, the best observation of the xyz variables in terms of resolution, precision and accuracy, allowing coastal areas to be reliably mapped. However, the use of lidar reflectivity (or intensity) for mapping purposes remains poorly examined. The added value of the lidar-derived near-infrared (NIR) intensity was estimated by comparing the classification results of nine representative coastal habitats obtained from a passive blue-green-red (BGR) dataset and from a combined BGR-NIR dataset.


Introduction
Coastal areas play a key role in the adaptation to ocean-climate change due to their land-sea interface [1]. Mapping and monitoring their use and cover are crucial to understand where the most exposed and vulnerable zones are located and how to manage them in a sustainable way [2]. The finest possible spatial resolution is required to empower the diagnosis and prognosis of coastal objects subject to current and future erosion and/or submersion risks. To date, unmanned aerial vehicles (UAVs) constitute the best platforms for carrying sensors capable of providing centimeter-scale 2D and 3D coastal information [3]. The active lidar instrument scans coastal landscapes at a rate of hundreds of thousands of points per second, with pulses propagating at the speed of light [4]. UAV-based lidar products achieve the best accuracy and precision in xyz data among airborne/spaceborne tools. However, lidar intensity remains poorly harnessed in Earth Observation from satellite to drone, despite its obvious added value in terms of spectral information [5].
This study aims to assess the contribution of the UAV-based lidar-derived near-infrared (NIR) intensity to the overall accuracy (OA) and kappa coefficient (κ) of the classification of a coastal landscape comprising nine representative natural, semi-natural and anthropogenic habitats. The lidar NIR contribution is quantified against passive blue-green-red (BGR) imagery, whose camera is co-located with the lidar sensor.
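The comparison therefore rests on two per-pixel feature sets, BGR and BGR-NIR. As a minimal sketch of how such stacks can be assembled (assuming the BGR orthoimage and the rasterized NIR intensity are already co-registered numpy arrays; all names are illustrative, not the authors' code):

```python
import numpy as np

def build_feature_stacks(bgr: np.ndarray, nir: np.ndarray):
    """Build the two per-pixel feature sets compared in this study.

    bgr: (rows, cols, 3) blue-green-red orthoimage
    nir: (rows, cols) rasterized lidar NIR intensity on the same grid
    """
    features_bgr = bgr.reshape(-1, 3)               # columns: B, G, R
    features_bgr_nir = np.column_stack(
        [features_bgr, nir.reshape(-1, 1)]          # columns: B, G, R, NIR
    )
    return features_bgr, features_bgr_nir
```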

Study Site
The study site is located along the bay of Mont-Saint-Michel, midway between the most extensive salt marshes in northern France and rural polders (Figure 1). This site was selected based on the diversity of its habitats, namely salt marsh, grass, dry grass, shrub, tree, soil, sediment, road, and car (Table 1). Every class was represented by 4,600 pixels, split into 2,300 calibration and 2,300 validation pixels. Both sub-datasets were spatially disjoint to avoid spatial autocorrelation. A total of 41,400 pixels was therefore used, first to train the probabilistic maximum likelihood learner, and then to test its predictive ability.
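The exact implementation of the maximum likelihood learner is not specified in the text; a common reading is a per-class multivariate Gaussian classifier, sketched below (class and variable names are illustrative):

```python
import numpy as np

class GaussianMLC:
    """Per-class multivariate Gaussian maximum likelihood classifier,
    a standard form of a 'probabilistic maximum likelihood learner'
    (a sketch, not the authors' exact implementation)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False)
            # store mean, inverse covariance and log-determinant per class
            self.params_[c] = (mu, np.linalg.inv(cov),
                               np.linalg.slogdet(cov)[1])
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, inv_cov, logdet = self.params_[c]
            d = X - mu
            # Gaussian log-likelihood up to a shared additive constant
            ll = -0.5 * (np.einsum('ij,jk,ik->i', d, inv_cov, d) + logdet)
            scores.append(ll)
        return self.classes_[np.argmax(np.stack(scores), axis=0)]
```

With the split described above, training and testing reduce to clf = GaussianMLC().fit(X_cal, y_cal) followed by y_pred = clf.predict(X_val).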

Drone Lidar Flight
The lidar drone mission was carried out on 5 June 2023 using a DJI Zenmuse L1 sensor mounted on a DJI Matrice 300 RTK quadcopter, linked to a DJI D-RTK 2 high-precision Global Navigation Satellite System (GNSS) base station. The flight followed these navigational parameters: 50 m height, 4 m/s speed, 12 min flight time, 2.04 km path length, 0.30 km² coverage, 233 BGR pictures, and 0.013 m ground sample distance.
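For reference, the reported 0.013 m ground sample distance is consistent with the usual pinhole relation, assuming the L1 camera's nominal 8.8 mm focal length and a pixel pitch of roughly 2.4 µm (neither value is stated in the text):

$$\mathrm{GSD} = \frac{H\,p}{f} = \frac{50\ \mathrm{m} \times 2.4 \times 10^{-6}\ \mathrm{m}}{8.8 \times 10^{-3}\ \mathrm{m}} \approx 0.0136\ \mathrm{m}$$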
The Zenmuse L1 sensor is designed with a 905 nm Livox Avia laser, a 200 Hz inertial measurement unit and a 1-inch RGB camera (20 MP), all mounted on a 3-axis gimbal fitted with a DJI Skyport, enabling the synchronization of the lidar RTK positioning with the Matrice 300 RTK system. The point sampling rate was fixed at 240 kHz in the dual-return mode, and the repetitive line scanning pattern was selected (field of view: 70.4° horizontal × 4.5° vertical). The lidar mission followed these specific parameters: 80% front overlap, 70% side overlap, and an average point density of 2,477 points/m². The DJI native (but proprietary) lidar format was imported into DJI Terra to obtain the .las format in the local datum RGF93, projected in Lambert 93, with IGN69 altimetry. The mean NIR intensity was rasterized at 0.01 m from the resulting point cloud (Figure 2).
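A minimal sketch of the mean-intensity rasterization step, assuming the exported .las file is read with laspy (file path and no-data handling are illustrative):

```python
import laspy
import numpy as np

def rasterize_mean_intensity(las_path: str, cell: float = 0.01) -> np.ndarray:
    """Grid the mean lidar return intensity onto a regular raster,
    as in the 0.01 m mean NIR intensity raster described above."""
    las = laspy.read(las_path)
    x, y = np.asarray(las.x), np.asarray(las.y)
    i = np.asarray(las.intensity, dtype=np.float64)

    col = ((x - x.min()) / cell).astype(np.int64)
    row = ((y.max() - y) / cell).astype(np.int64)   # row 0 at the top
    total = np.zeros((row.max() + 1, col.max() + 1))
    count = np.zeros_like(total)
    np.add.at(total, (row, col), i)                 # sum intensities per cell
    np.add.at(count, (row, col), 1)                 # returns per cell

    with np.errstate(invalid="ignore", divide="ignore"):
        return total / count                        # NaN where no return fell
```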

Landscape Scale
The OA and κ were derived from the confusion matrices established from the validation datasets of the BGR (Table 2) and BGR-NIR (Table 3) classifications. OA and κ reached 84.57% and 0.8264 for the BGR dataset, and 88.71% and 0.8730 for the BGR-NIR dataset, respectively (Figure 3).
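Both metrics follow the standard confusion-matrix definitions, where $n_{ii}$ are the diagonal counts, $n_{i+}$ and $n_{+i}$ the row and column totals, and $N$ the total number of validation pixels:

$$\mathrm{OA} = p_o = \frac{\sum_i n_{ii}}{N}, \qquad \kappa = \frac{p_o - p_e}{1 - p_e}, \qquad p_e = \frac{1}{N^2}\sum_i n_{i+}\, n_{+i}$$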

Habitat Scale
Regarding the producer's accuracy (PA), the habitats that benefited most from the NIR addition were "road", "grass" and "soil", whereas "tree" lost a little detection accuracy.
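These per-class scores follow directly from the confusion matrices; a minimal helper, assuming rows hold reference labels and columns hold predictions (the opposite convention simply swaps the two outputs):

```python
import numpy as np

def per_class_accuracies(cm: np.ndarray):
    """Producer's and user's accuracy for each class of a confusion
    matrix with reference classes in rows and predictions in columns."""
    correct = np.diag(cm).astype(float)
    pa = correct / cm.sum(axis=1)   # producer's: correct / reference total
    ua = correct / cm.sum(axis=0)   # user's: correct / predicted total
    return pa, ua
```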
The consistent increase for "salt marsh" and "soil" might be explained by their higher and lower reflectance in the NIR spectrum, respectively. High salt marsh vegetation, such as the Puccinellia, Festuca, Aster, Limonium or Elymus genera, displays a tangibly higher NIR reflectance in the summer season [6], while the "soil" investigated here corresponded to the transitional wet-to-dry area just above a pond, hence the lower NIR reflectance due to moisture.

Conclusions
The contribution of the UAV-borne lidar-derived NIR intensity to the classification of a coastal landscape (comprising nine representative habitats) was evaluated by comparing the OA, PA and UA results of a passive BGR dataset and of a combined passive-active BGR-NIR dataset, using a probabilistic maximum likelihood classifier. At the landscape level, the addition of the lidar NIR intensity to the BGR reference increased OA by 4.14%. At the habitat level, "salt marsh" and "soil" gained 4% and 8.95% in PA, respectively, and 4.56% and 9.48% in UA, respectively. It is therefore recommended to add the lidar-derived intensity into classifications when the front and side overlaps reach at least 80% and 70%, respectively.

Table 1.
Habitat name, description and blue-green-red derived thumbnails.

Table 2.
Confusion matrix derived from the blue-green-red classification.

Table 3.
Confusion matrix derived from the blue-green-red + lidar-derived near-infrared classification.

Table 4.
Differences in producer's accuracy and user's accuracy between the BGR and BGR-NIR classifications.