Article

HySIMU: An Open-Source Toolkit for Hyperspectral Remote Sensing Forward Modelling

Department of Geological Sciences and Geological Engineering, Queen’s University, Kingston, ON K7L3N6, Canada
* Author to whom correspondence should be addressed.
Remote Sens. 2026, 18(6), 943; https://doi.org/10.3390/rs18060943
Submission received: 6 February 2026 / Revised: 16 March 2026 / Accepted: 17 March 2026 / Published: 20 March 2026

Highlights

What are the main findings?
  • HySIMU, an open-source and modular Python-based hyperspectral remote sensing forward modelling starter toolkit that integrates various non-proprietary modules and libraries into an established processing workflow.
What are the implications of the main findings?
  • HySIMU enables the community to perform sensitivity tests for various survey scenarios to optimize mission planning and design.
  • HySIMU provides synthetic image generation capabilities to support and enhance image-processing algorithm development.

Abstract

Hyperspectral remote sensing (HRS) is gaining widespread adoption within the geoscience and Earth observation communities. It fosters diverse applications, including precision agriculture, soil science, mineral exploration, and carbon detection. Recent technological advancements have facilitated a growing number of satellite missions as well as an increase in the availability of commercial sensors and platforms, such as drones. A significant challenge in deploying these varied platforms and sensors is the design and optimization of hyperspectral surveys. Forward modelling simulators are valuable for optimizing mission parameters and estimating imaging performance, but the limited accessibility of open-source simulators presents an obstacle for users who seek to benefit from such tools. To bridge this gap, HySIMU (Hyperspectral SIMUlator) was developed and is described herein. It is an open-source forward modelling toolkit that integrates a primary processing pipeline with various open-source packages into a transparent and modular workflow, offering a cost-effective approach to evaluating the performance of hyperspectral surveys. HySIMU is designed to simulate hyperspectral imagery based on user-defined targets, platforms, and sensor parameters. Features include (i) a ground truth data cube builder with customizable input parameters, (ii) a terrain-based solar and view geometry calculator for illumination modelling, (iii) integrated open-source radiative transfer models for incorporating atmospheric effects, and (iv) spatial resampling filters. In this manuscript, the initial framework of HySIMU is presented with example applications, including two validation studies with real hyperspectral images. As remote sensing technologies advance, forward modelling toolkits such as HySIMU play a crucial role in refining mission designs and assessing survey feasibility.
The scalability for arbitrary hyperspectral sensors, platforms, and spectral libraries ensures broad applicability. Of particular importance is support for parameter optimization for both scientific and commercial HRS campaigns.

1. Introduction

Hyperspectral remote sensing (HRS) has paved the way for various innovative applications in Earth observation (EO) fields, including but not limited to agriculture, geology, water resources, and land cover [1]. These applications draw on spaceborne missions such as the decommissioned Hyperion (on the EO-1 platform) and TianGong-1, and the ongoing PRISMA (PRecursore IperSpettrale della Missione Applicativa), EnMAP (Environmental Mapping and Analysis Program), and HISUI (Hyperspectral Imager SUIte). Moreover, compared to satellite missions, airborne HRS missions have provided relatively higher spatial resolution since their introduction in the late 1980s and have delivered promising results across a wide range of applications [1,2]. These airborne missions include the Hyperspectral Mapper (HyMAP), the Compact Airborne Spectrographic Imager (CASI), and the Airborne Visible/InfraRed Imaging Spectrometer (AVIRIS). For additional information on HRS satellite and airborne missions and applications, see [1,2,3,4,5].
Recent advancements in hyperspectral imaging (HSI) technologies have facilitated the acquisition of higher-spatial-resolution data from spaceborne platforms that may eventually compete with airborne missions. Commercial companies such as Pixxel [6], Wyvern [7], and Orbital Sidekick [8] promise satellite-based hyperspectral imagery with spatial resolutions as high as 5 m. Correspondingly, there has been an increase in the availability and accessibility of public and commercial datasets. In recent years, the rapid development of UAV (Uncrewed Aerial Vehicle) systems and the miniaturization of HSI sensors have enabled this type of platform to enter the HRS scene, offering a lower-cost alternative for acquiring higher-resolution imagery. Despite limited spatial coverage, the popularity of UAV-based HRS missions for specialized applications has increased [9,10]. Adão et al. [11] outline a range of hyperspectral sensors that are compatible with UAV platforms and associated applications. Zhong et al. [12] list common fixed-wing, helicopter, multirotor, and hybrid UAVs considered for hyperspectral missions and their corresponding specifications. These developments, along with the increasing number of publications on hyperspectral remote sensing in general [13,14] and in specialized applications such as agriculture [15] over the past decade, indicate exciting times ahead for the future of HRS.
The growing variety of HRS platforms and sensors adds more complexity to decision-making and mission planning. To achieve products that meet certain scientific or industrial standards, a comprehensive assessment of target resolvability and mission feasibility is critical during the planning process, particularly given the vast array of data sources and configuration options. As HRS systems and technologies continue to evolve and more missions are planned, a deeper understanding of how different system parameters influence overall performance is essential [16]. Given the complex relationship between the inputs and outputs of HRS systems, along with budgeting constraints, a simulation approach is often preferred to support the detailed system design, optimization, and evaluation of mission parameters [17].
An HRS mission requires a realistic assessment of target resolvability, which would ideally proceed from analyses through to the acquisition of high-fidelity hyperspectral images for various targets and background materials; this is often impractical due to time and cost constraints [18]. Given the cost associated with acquiring high-resolution hyperspectral imagery, it is often unfeasible to conduct flight tests to evaluate multiple configurations and determine the optimal parameters for specific applications [19,20]. These constraints amplify the necessity of simulation tools for designing new EO imaging systems. Forward simulation approaches, which model the imaging system of a new sensor and generate sensor-like output data, are instrumental in testing and refining sensor designs before deployment [21]. The ability to simulate a remote sensing system’s process allows for the adaptation and optimization of sensor and observation conditions [17] and helps users select the parameters that provide the most cost-effective configuration. By constructing and validating models, users can reinforce their understanding of the complexities of HSI systems and processes [22]. This capability is invaluable for both theoretical research and practical applications, ensuring that hyperspectral imaging systems are designed and operated within their optimal performance metrics.
Over the years, various hyperspectral remote sensing simulators, designed based on either analytical approaches or physics-based modelling schemes, have been developed with various primary objectives. The Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model takes a statistical approach, modelling surface spectral reflectances that are propagated through an end-to-end remote sensing system [23]. FASSP identifies the components of a spectral imaging system analytical model, including scene, sensor, and processing models. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is an extensive physics-based image and data simulator designed to generate synthetic multi- and hyperspectral imagery across the visible to thermal infrared spectrum [24,25]. The Software Environment for the Simulation of Optical Remote sensing systems (SENSOR) consists of three parts that describe the geometrical relationship between the Sun, target, and remote sensing system; the radiometry; and the optical and electrical sensor model [17]. Some simulators are also designed specifically to support the development of a satellite mission, such as the EnMAP End-to-End Simulation Tool (EeteS) [21,26]. These simulators are intended for EO, but their architectural framework can also be used to simulate hyperspectral imagery for other planets, as demonstrated by [22] through their simulator for the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) instrument. There are more hyperspectral simulators than this short introduction is able to cover; however, Zahidi et al. [18] provide a comprehensive overview of some hyperspectral simulators, including the Cranfield Hyperspectral Image Modelling and Evaluation System (CHIMES).
Some of these simulators are either commercialized or include trademarked components, while a few others were developed specifically in support of a publicly funded HRS mission. The rest, for numerous reasons, are not readily accessible to the remote sensing community. While publicly funded spaceborne and airborne missions are usually able to include a specialized simulator in their planning phase, UAV missions are rarely equipped with such toolkits, and survey sensitivity analysis is often neglected. For these reasons, it is difficult for the research community or small to medium-sized enterprises to realize the benefits of an HRS simulator. With the variety of datasets from public missions, the diverse options of platforms and sensors previously outlined, and the inaccessibility of the simulators, there is a need for a fully open-source, modular, and sensor- and platform-independent HRS forward modelling toolkit. Such a simulator should be designed to evaluate the optimum performance metrics of an HRS mission across a variety of target settings, atmospheric conditions, and sensor characteristics. In this study, we aim to establish a starting framework for such a simulator, called HySIMU, that integrates a primary processing workflow with various non-proprietary modules and libraries. This framework provides easy access to an HRS simulator that produces reliable synthetic images for the remote sensing community, with which users can simulate various mission scenarios and test system performance against their specific parameters. The open-source nature of the toolkit will, we hope, motivate users in the community to test it for various applications, as well as encourage improvements and additional modules. HySIMU is designed to be independent of target scale, platform, or sensor. Note that this manuscript is intended to introduce the toolkit rather than study specific applications or conduct sensitivity studies.
Applications of the previous versions of the toolkit can be found in [27,28,29]. The manuscript is structured as follows: Section 1 outlines the motivation and objective, Section 2 describes HySIMU’s framework and modules, Section 3 discusses sample results and validation studies, and Section 4 summarizes the toolkit and its advantages.

2. Methods: HySIMU Framework

We developed the HySIMU framework following the general architecture provided by FASSP [23] with a focus on the scene and sensor models, the forward modelling flow by SENSOR [17], and the modules of EeteS [21,26]. HySIMU simulates at-sensor radiance datacubes from synthetic or realistic surface reflectance maps with the assumption that the image is geometrically calibrated. It is designed to be sensor- and platform-independent, and with its modularity, the end user will be able to incorporate a sensor with preferred specifications according to their requirements. However, it should be noted that simulations should focus on the required spectral and spatial parameters to avoid excessive computational burdens.
The simulator is written in Python v3.11.5, integrates open-source libraries, and supports parallelization on High-Performance Computing (HPC) clusters. It is not categorized as an end-to-end simulation system because it does not include sensor calibration and electronic noise modules. However, the modular nature of the simulator allows for any additional module to be added by users.
Figure 1 shows the general workflow diagram of HySIMU, which is composed of several modules. The main modules of HySIMU are: (i) ground truth builder, (ii) datacube texturing, (iii) solar and view geometry computation, (iv) radiative transfer modelling, and (v) spatial resampling. These Python modules are executed sequentially through the main HySIMU script that takes and distributes the inputs and passes them into each corresponding module. Outputs from a module are automatically passed into the subsequent module. Future updates and community integrations can be added to the HySIMU GitHub repository as standalone modules or scripts imported into the main script.

2.1. Ground Truth Builder

HySIMU requires an initial map that represents regions of material distribution or “spectral zones”. This map serves as the “ground truth” from which a synthetic surface reflectance datacube is derived. While constructing a surface reflectance scene can be complex, HySIMU applies a general implementation that bridges deterministic-geometrical and stochastic-statistical approaches [16,30]. This module within the HySIMU framework follows, with some adjustments, the “frequency synthesis of landscapes” algorithm [31] to randomly generate fractal fields. The process is as follows:
i.
A set of Discrete Fourier Transform (DFT) frequencies using a Fast Fourier Transform (FFT) algorithm is created for a user-specified grid.
ii.
A power spectrum synthesis is then computed from these frequencies according to the power law Equation (1) [32]:
P = k^(−(2H+1)),
where P is the power spectrum coefficient, k is the FFT frequency, and H is the Hurst exponent. The configurable Hurst exponent is introduced to control the overall “roughness” or complexity of the map, with lower values representing more complex clustering. Random noise (within the modelled frequency range) is added to the coefficients.
iii.
An Inverse Discrete Fourier Transform (IDFT) is performed to transform the coefficients that have been generated from the frequency grid and scaled with the noise into the spatial domain, yielding a representation of a random two-dimensional (2D) fractal field.
iv.
The fractal field is subsequently discretized based on pixel values and indexed into distinct regions to represent the spectral zones. Each index is associated with the corresponding material spectra from the input.
If preferred, the generated fractal fields can also serve as an input Digital Elevation Model (DEM) to delineate the intrinsic spatial relationships between topography and material distribution, as in the case of geological surface targets in mineral exploration. This ground truth-building approach, with a simple statistical parameter to represent the complexity, allows for better control in sensitivity studies, enabling users to systematically test and adjust the response of their systems to various scenarios [22]. Samples of spectral zone maps produced by HySIMU are shown in Figure 2. Alternatively, the ground truth can also be imported as a user input or generated by discretizing an input DEM. The ground truth scene must have a spatial resolution equal to or higher than the desired sensor spatial resolution.
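The fractal-field generation steps described above can be sketched in a few lines of NumPy. This is an illustrative stand-alone implementation under our own assumptions, not HySIMU’s actual code; the function names are ours, and taking the real part of the inverse FFT is a common shortcut in place of enforcing Hermitian symmetry:

```python
import numpy as np

def fractal_field(n, hurst, seed=0):
    """Random 2D fractal field via frequency synthesis (steps i-iii)."""
    rng = np.random.default_rng(seed)
    k = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k)
    kmag = np.hypot(kx, ky)
    kmag[0, 0] = 1.0                      # avoid division by zero at the DC term
    power = kmag ** -(2 * hurst + 1)      # power-law scaling of the spectrum
    phase = rng.uniform(0.0, 2.0 * np.pi, (n, n))  # random noise on the coefficients
    coeffs = np.sqrt(power) * np.exp(1j * phase)
    return np.fft.ifft2(coeffs).real      # IDFT back to the spatial domain

def to_zones(field, n_zones):
    """Discretize the field into indexed spectral zones (step iv)."""
    edges = np.quantile(field, np.linspace(0.0, 1.0, n_zones + 1))
    return np.clip(np.digitize(field, edges[1:-1]), 0, n_zones - 1)

# Example: a 128 x 128 map with five spectral zones
zones = to_zones(fractal_field(128, hurst=0.7), n_zones=5)
```

Lower Hurst exponents leave more power at high frequencies, producing the more fragmented zone maps described in the text.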
This module also contains a function to initialize texture augmentation of the ground truth, to simulate the presence of spatial variations or heterogeneity within each of the spectral zones. Within each zone, spatially correlated noise is added using a clustered Gaussian random field generated using the GSTools library [33]. The Gaussian distribution is based on a study which demonstrated, from a set of real-world scenes, that the higher-frequency spatial components of hyperspectral imagery could be described by Gaussian mixture models [34]. A covariance nugget is provided as a customizable parameter to vary the degree of correlation. These clusters are populated with synthetic spectra of the same endmember material, generated by the following module, to eventually add texture to the surface reflectance datacube.
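HySIMU itself uses GSTools for the clustered Gaussian random fields. As a dependency-free illustration of the underlying idea only (a sketch under our own assumptions, not the toolkit’s implementation), spatially correlated noise can be generated by filtering white noise in the frequency domain, where the filter width plays a role analogous to the correlation length:

```python
import numpy as np

def correlated_noise(n, corr_len, seed=0):
    """Zero-mean, unit-variance, spatially correlated Gaussian noise field."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))
    k = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k)
    # Gaussian low-pass filter in frequency space; wider corr_len -> smoother field
    filt = np.exp(-(kx**2 + ky**2) * corr_len**2)
    field = np.fft.ifft2(np.fft.fft2(white) * filt).real
    return (field - field.mean()) / field.std()
```

Such a field, thresholded or clustered per spectral zone, can then be used to assign the synthetic within-class spectra that provide texture.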

2.2. Datacube Texturing

The ground truth scene produced by the previous module is populated with spectral reflectance data from user input or randomly selected samples, categorized by class types, from the ECOSTRESS library [35] integrated into the Spectral Python package [36]. The input reflectance data must be spectrally oversampled in order to yield an accurate resampling to the sensor spectral resolution. In creating synthetic imagery for use in multi-/hyperspectral remote sensing, it is important to realistically reproduce the presence of texture arising from variations within a specific land cover class [37]. HySIMU provides a module to synthesize new spectral reflectance data from the input, following a method described in [37]. The method utilizes a set of reflectance spectra from the same class to derive a mean spectrum and a covariance matrix. Random samples are drawn from a multivariate normal (MV-N) distribution calculated from the covariance matrix with the mean spectrum as the base. Manolakis et al. [38] showed that an MV-N distribution can categorize HSI data accurately.
Figure 3 shows a set of sample results from this HySIMU module. Figure 3a shows the reflectance spectra of five arbitrary Alunite samples taken from the USGS spectral library [39]. The black spectrum (present in all plots) is the mean computed from the five samples. Using the mean and the covariance matrix, 10 new spectra are synthesized, as shown in Figure 3b. While appropriate statistics for each specific class derived from real imagery are ideal [16] for a simulation study, these statistics are often not easily obtainable, and only one endmember spectrum is available. To manage this, HySIMU also implements a simplified approach. Instead of calculating a mean spectrum, the input spectra serve as the base, and a number of intermediate spectra are computed using a specified standard deviation. The same steps as previously described then follow. Synthetic spectra produced using this approach are shown in Figure 3c. While the synthesized spectra are more consistent with the base spectrum compared to Figure 3b, they can be a sufficient representation of spectral texture for the simulated datacube and provide an optional parameter for users to calibrate.
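The multivariate-normal synthesis described above (mean spectrum plus band-to-band covariance) reduces to a short NumPy routine. This is a minimal sketch of the statistical idea, with our own function name, rather than HySIMU’s module:

```python
import numpy as np

def synthesize_spectra(samples, n_new, seed=0):
    """Draw synthetic spectra from an MV-N distribution fitted to class samples.

    samples: array of shape (n_samples, n_bands), reflectance spectra of one class.
    Returns an array of shape (n_new, n_bands).
    """
    rng = np.random.default_rng(seed)
    mean = samples.mean(axis=0)            # mean spectrum of the class
    cov = np.cov(samples, rowvar=False)    # band-to-band covariance matrix
    return rng.multivariate_normal(mean, cov, size=n_new)
```

The simplified single-endmember option in the text corresponds to replacing the fitted covariance with one built from a user-specified standard deviation around the base spectrum.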
The synthetic spectra, representing spectral variability, are created from the input endmembers and are then allocated to the ground truth model. The ground truth model is based on the previously determined spectral regions blended with spatially correlated Gaussian noise produced in the previous module. Figure 4 shows two false-colour images of a synthetic surface reflectance datacube populated with six arbitrary mineral spectra from the USGS library. Figure 4a shows a non-textured or smooth datacube, while Figure 4b shows a textured datacube, where each spectral region is divided further into twenty-five spatially correlated subregions.

2.3. Solar and View Geometry Calculations

Another source of texture in hyperspectral imagery comes from variations in the solar angle of incidence due to topography [37]. Using the generated DEM or an input DEM (which must be of the same resolution as the ground truth), HySIMU corrects the angle of incidence from the sun zenith angle for each pixel. It uses the haversine library [40] to correct the coordinates based on the provided reference coordinates, the pvlib library [41,42] to determine the solar position (azimuth and elevation) at the specified simulation location and time, and the insolation library [43] to determine the slope aspects, slope gradients, and solar vectors. The corrected angle is calculated using the formula for the angle of incidence on an inclined surface [44]:
cos θ = cos θ_z cos β + sin θ_z sin β cos(γ_s − γ),
where θ is the angle of incidence, θ_z is the solar zenith angle, β is the slope angle, γ_s is the solar azimuth angle, and γ is the surface azimuth angle. The corrected solar and view angles serve as inputs for the radiative transfer model in the following module. Figure 5 shows the effects of solar geometry correction on the final product of HySIMU. Without the correction, details related to terrain shadowing and terrain effects are not modelled, resulting in a less realistic image; this can significantly affect interpretation and classification in variable terrain scenarios.

2.4. Radiative Transfer Modelling

Using Lambertian surfaces and previously computed solar and view geometry models, atmospheric effects are applied to the generated surface reflectance datacube, pixel-by-pixel, using an integrated Radiative Transfer Model (RTM) to generate the at-sensor radiance datacube. The HySIMU workflow incorporates two open-source RTMs: 6S (the Simulation of a Satellite Signal in the Solar Spectrum) [45] and libRadtran (the library for Radiative transfer) [46,47]. Both RTMs have been validated in previous studies over the years (see [48,49,50] for 6S and [51,52] for libRadtran), and both perform better than other RTMs for specific applications [53]. The option to pick and choose the RTM provides more flexibility to the users and depends on their applications/goals. It should be noted that both RTMs require a different input parameter setup.
A vector version of the original 6S code (6SV1.1) RTM is implemented in the workflow through a Python wrapper called py6S [54]. This version can simulate the atmospheric radiative transfer of polarised and non-polarised visible and infrared radiation under different atmospheric conditions. HySIMU also implements libRadtran through a Python wrapper called pyLRT [55]. The model supports various solvers, with DISORT (Discrete Ordinates Radiative Transfer) being the one integrated into HySIMU, which allows high-resolution computation of radiance in the VIS-NIR-SWIR (visible light, near-infrared, and shortwave infrared) range. In addition to standard atmospheric parameters, HySIMU allows users to customize the gaseous properties of the atmosphere, such as H2O, CO2, and CH4 concentrations, as supported by libRadtran.
Both RTMs are suitable for any kind of remote sensing platform (e.g., UAVs, aircraft, satellites). While the Lambertian surface assumption is a simplification, both RTMs provide options to use Bidirectional Reflectance Distribution Functions (BRDFs) within their respective modules; these are not yet included in HySIMU.
Additionally, HySIMU provides a module to compute radiance without incorporating an RTM, following the inverse of the reflectance formula [56]:
ρ_λ = π L_λ d² / (ESUN_λ cos θ_z),
where ρ_λ is the Top-Of-Atmosphere (TOA) reflectance (unitless), L_λ is the at-sensor spectral radiance (W/(m² sr μm)), d is the Earth-Sun distance in astronomical units, ESUN_λ is the exoatmospheric solar irradiance (W/(m² μm)), and θ_z is the solar zenith angle. For this function, solar irradiance values are sourced from the global reference spectrum of the ASTM G173-03 Standard Spectrum [57] dataset included in the pvlib library. Sample images generated from all three radiance computation options available in HySIMU are shown in Figure 6a, and the corresponding radiance spectra from an arbitrary pixel are shown in Figure 6b. Pixel-based Spectral Angle Mapper (SAM) [58] analysis performed on the radiance spectra for all pairs of images produced an overall mean score below 6% (after normalization to 1 radian): libRadtran-6S at 3.245% (0.102 rad), libRadtran-noRTM at 5.455% (0.171 rad), and 6S-noRTM at 3.955% (0.124 rad), indicating a high degree of similarity. Nevertheless, these values depend on the land cover of the scene as well as the atmospheric conditions during the acquisition. These examples show RTM-dependent radiance differences that are larger at shorter wavelengths and smaller at longer wavelengths. The option of using different RTMs in HySIMU thus allows for an uncertainty estimation arising from varying model performance.
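The no-RTM radiance option simply inverts the TOA reflectance formula. A minimal sketch follows (the function name is ours; in practice the per-band ESUN values would come from the ASTM G173-03 spectrum in pvlib, as noted in the text):

```python
import numpy as np

def toa_radiance(reflectance, esun, sun_zenith_deg, d_au=1.0):
    """At-sensor radiance by inverting the TOA reflectance formula:

        L = rho * ESUN * cos(theta_z) / (pi * d^2)

    reflectance and esun may be scalars or per-band NumPy arrays.
    """
    cos_tz = np.cos(np.radians(sun_zenith_deg))
    return reflectance * esun * cos_tz / (np.pi * d_au**2)
```

Applying the forward reflectance formula to the output recovers the input reflectance, which makes the inversion easy to sanity-check.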

2.5. Spatial Resampling

Following the incorporation of atmospheric effects, the full-resolution at-sensor datacube is resampled to the desired sensor’s spatial resolution. Typically, HSI imagery is spatially downsampled using spatial resampling algorithms such as pixel aggregate, nearest neighbour, bilinear, and cubic convolution or using more complex algorithms specialized for data fusion [20]. Although these resampling approaches can downsample imagery to any desired pixel size, the imagery captured by the sensor does not necessarily represent the scene accurately, with small feature details blurred relative to larger features [59]. This effect is characterized by the sensor’s net Point Spread Function (PSF). Consequently, when only using a spatial downsampling technique as mentioned above, these methods fail to account for the characteristics of the simulated sensor. Rather, spatial resampling should be performed using a weighted average of the PSF of the simulated sensor [20]. This approach can be found in [60], where a convolution of the synthetic HSI scene with the sensor PSF was implemented to preserve the sensor’s characteristics.
A sensor’s PSF (PSFnet) is composed of four components [59]:
i.
An optical PSF introduced by the energy distribution spread in the sensor’s focal plane.
ii.
An image motion PSF caused by the motion of the sensor during the integration time.
iii.
A detector PSF that represents the non-zero spatial area of each detector in the sensor.
iv.
An electronic PSF introduced by the electronic filter of the sensor.
The total PSFnet is computed by convolving these four PSF components.
HySIMU provides an option to compute a simplified PSFnet, combining only the optical and detector PSFs, to accommodate users who may not have detailed sensor and platform parameters. Although the optical PSF is theoretically represented by an Airy pattern, HySIMU implements an approximation using a Gaussian function [59], as follows:
PSF_opt(x, y) = (1 / (2π a b)) e^(−x²/a²) e^(−y²/b²),
where x and y are positional variables, and a and b are the widths of the PSF in the cross-track and in-track directions, respectively. Moreover, the detector PSF is modelled using a rectangular function [59]:
PSF_det(x, y) = rect(x/w) · rect(y/w),
where x and y are positional variables, and w is the width of the PSF. Thus,
simplified PSF_net = PSF_opt ∗ PSF_det.
A significant portion of the signal received by each pixel originates from materials outside the pixel’s defined spatial extent [19]. The values of a Gaussian function decrease with distance and become negligible beyond three standard deviations, which implies that there is no advantage to using a Gaussian window wider than six standard deviations [61]. Correspondingly, HySIMU uses a Gaussian kernel with a size equal to three times the spatial size of the sensor pixel as a default; however, it can be adjusted accordingly.
HySIMU also integrates the sensor-independent SR2 function developed by Inamdar et al. for pushbroom sensors [20]. The SR2 function combines optical PSF, image motion PSF, and detector PSF with several assumptions: (i) the PSFnet is wavelength-independent, (ii) the aircraft is flying at a constant speed in a perpendicular direction to the detector array, (iii) the spatial response is uniform over pixel spatial boundaries, and (iv) the data is georeferenced. While the function relies on these assumptions, it allows users to incorporate more detailed sensor parameters as well as platform motion parameters. It has been shown that the SR2 is effective in downsampling HSI data and that potential improvements can still be added [20].
The difference between images generated by each of the PSF functions is shown in Figure 7. The simplified PSF produces a blurrier image when using the default kernel size; further calibration of the kernel size is possible to fit users’ specifications and applications. After the PSFnet has been computed, the full-resolution at-sensor radiance/reflectance datacube generated in the previous module is convolved with the PSFnet to preserve the sensor characteristics. Subsequently, the convolved datacube is downsampled to the sensor’s resolution using a traditional downsampling technique with interpolation orders from 1 to 5.
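The simplified PSFnet and the convolve-then-downsample procedure can be illustrated with separable 1D kernels. This is a NumPy sketch under the stated Gaussian/rect approximations; the function names and kernel widths are ours, simple decimation stands in for the interpolated downsampling, and a real implementation would apply the PSF to every band of the datacube:

```python
import numpy as np

def gaussian_kernel_1d(width, size):
    """Sampled, normalized 1D Gaussian (optical PSF approximation)."""
    x = np.arange(size) - (size - 1) / 2.0
    g = np.exp(-x**2 / width**2)
    return g / g.sum()

def box_kernel_1d(w):
    """Normalized rect function of width w samples (detector PSF)."""
    return np.ones(w) / w

def simplified_psf_1d(width, w, size):
    """Convolve the optical and detector responses into a 1D PSFnet."""
    return np.convolve(gaussian_kernel_1d(width, size), box_kernel_1d(w), mode="full")

def blur_and_downsample(band, psf_x, psf_y, factor):
    """Separable convolution with the PSF, then decimation to the sensor grid."""
    out = np.apply_along_axis(lambda r: np.convolve(r, psf_x, mode="same"), 1, band)
    out = np.apply_along_axis(lambda c: np.convolve(c, psf_y, mode="same"), 0, out)
    return out[::factor, ::factor]
```

Because both component kernels are normalized, their convolution is normalized as well, so a uniform scene keeps its value away from the image edges after filtering.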

3. Discussion: Validation and Potential Applications

An ideal validation scheme for HySIMU would require a hyperspectral image accompanied by a corresponding higher-resolution ground truth map, endmember spectra covering a broader spectral range than the sensor, and statistical parameters for each spectral zone. However, such a complete dataset rarely exists in practice, which was the main challenge we encountered during the development of this toolkit. A linear simulator such as HySIMU is highly dependent on the input ground truth and spectra provided. In response, we set up a more straightforward validation scheme using an AVIRIS image, albeit with a limited spatial resolution. Here, we focus only on an aircraft image simulation, as previous works have already covered UAV and satellite applications [27,28,29].
This section presents a validation study using the Jasper Ridge hyperspectral image taken by the AVIRIS mission in 1992 (referred to as JR). A 100 × 100-pixel subset of the full reflectance image (Figure 8e) has been classified in previous works [62,63,64]. The abundance map and endmember spectra are shown in Figure 8a and Figure 8b, respectively. Since the abundance image was derived from the hyperspectral image itself, its spatial resolution is not adequate to create a realistic ground truth map that can be downsampled to the sensor resolution; thus, simplification approaches are needed, and careful consideration is required when interpreting the results.
From the abundance map, we derived two simplified ground truth maps (GT1 and GT2) using the following methods: (i) GT1 was created by assigning each pixel to the spectral class with the highest abundance; (ii) GT2 was created by upscaling the ground truth to 200 × 200 pixels using a nearest neighbour algorithm and introducing three additional spectral classes in pixels lacking a single dominant class (>75% abundance). The three additional classes are dominated by both Tree and Soil classes. These classes represent mixtures of both endmembers with varying ratios (75%/25%, 25%/75%, and 50%/50%) and add more complexity to the ground truth data, which are shown in Figure 8c (GT1) and Figure 8d (GT2). The ground truth maps and endmember spectra sets were input into the HySIMU workflow, with parameters based on the AVIRIS sensor and the ER-2 platform, as listed in Table 1. The DEM from the time of the survey was not available; hence, the simulations herein assume a flat DEM. Two synthetic hyperspectral reflectance images (REF1 and REF2) were generated using this setup, and are shown in Figure 8f,g. Computational performances of both REF1 and REF2 simulations are provided in Table 2. Note that these simulations did not include DEM-based solar view geometry computation, the SR2 filter, and radiance computation. Simulation runtimes are expected to increase if any of these functions are included.
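The ground-truth derivations described above can be sketched as follows. This is illustrative only (our own function names); the GT2 mixed-class assignment based on the 75% dominance threshold is reduced here to its nearest-neighbour upscaling step:

```python
import numpy as np

def abundance_to_gt(abund):
    """GT1: assign each pixel to the class with the highest abundance.

    abund: array of shape (n_classes, rows, cols) of fractional abundances.
    Returns an integer class map of shape (rows, cols).
    """
    return np.argmax(abund, axis=0)

def upscale_nn(gt, factor=2):
    """GT2 (first step): nearest-neighbour upscaling of the class map,
    e.g. 100 x 100 -> 200 x 200 for factor=2."""
    return np.kron(gt, np.ones((factor, factor), dtype=gt.dtype))
```

Pixels in the upscaled map lacking a dominant class would then be reassigned to the additional mixed Tree/Soil classes described in the text.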
The results show strong qualitative agreement between the Jasper Ridge reflectance image and the simulated images. While the other three spectral classes exhibit colours similar to those in the JR image, the Tree spectral zones appear darker in both REF1 and REF2. This discrepancy may result from the shadowing effects of the vegetation canopy and the variability in vegetation types in the reflectance image, which could influence the derived Tree endmember spectrum during the unmixing process. Owing to the limited resolution of the ground truth, the simulated images appear more pixelated than the true reflectance image, a result of reduced mixing across the pixels, in addition to the atmospheric effects applied to the simulated images.
Figure 9 shows scattergrams of reflectance values from a near-infrared (NIR) band and a Red band for all three images. Note that, due to different value calibrations, the scattergrams are not plotted on identical scales. Real images, owing to factors such as sensor noise, topographic shading, and subpixel mixing, often produce a diffuse scattergram [59], indicating substantial mixing. This effect can be observed in the JR scattergram (Figure 9a), with “clouds” of data points and no observable clustering, and is less prominent in the REF1 (Figure 9b) and REF2 (Figure 9c) scattergrams. REF1 displays more clustered data points, indicating weak mixing between the spectra, as expected because REF1 is simulated directly from the abundance image. REF2, by contrast, exhibits stronger mixing than REF1 because of its higher-resolution ground truth and the mixed spectra used as inputs. Intrascene variability is expected to increase as the diversity of mixed spectra represented in the ground truth grows (e.g., adding more mixed proportions such as 60% Tree and 40% Soil spectra). Outliers are also more apparent in the JR image, consistent with the characteristics of real remote sensing images. These observations demonstrate that, despite its dependence on the spatial resolution of the ground truth, HySIMU can effectively mix the spectra, making it a reliable tool for simulating images from an input ground truth.
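A scattergram such as those in Figure 9 is, at its core, a 2-D histogram of two band images. The sketch below shows how such data can be produced, together with a crude "diffuseness" proxy (the fraction of occupied histogram cells); the band indices and the proxy itself are illustrative choices, not band numbers or metrics used in the study.

```python
import numpy as np

def scattergram(cube, red_band, nir_band, bins=100):
    """2-D histogram of Red vs. NIR reflectance values for one image.
    A diffuse histogram (many weakly populated cells) suggests strong
    spectral mixing; tight clusters suggest little mixing.
    cube: (rows, cols, bands) reflectance datacube."""
    red = cube[..., red_band].ravel()
    nir = cube[..., nir_band].ravel()
    counts, red_edges, nir_edges = np.histogram2d(red, nir, bins=bins)
    return counts, red_edges, nir_edges

def occupied_fraction(counts):
    """Fraction of non-empty histogram cells; a crude diffuseness proxy."""
    return np.count_nonzero(counts) / counts.size
```

With this proxy, a real scene like JR would be expected to yield a higher occupied fraction than a weakly mixed simulation like REF1.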
The JR reflectance image is atmospherically corrected while REF1 and REF2 are not; thus, a direct quantitative comparison with the simulated images is difficult, and correcting REF1 and REF2 for atmospheric effects would be counterproductive. Instead, we opted for comparison metrics that do not rely on absolute values: the Spearman rank correlation (rs) [65], which captures monotonic relationships between image values, and the Structural Similarity Index Measure (SSIM) [66], which captures structural information. The results for these metrics are shown in Table 3. The rs values signify consistent trends between REF1 and REF2 relative to the JR image, while the mean SSIM values, albeit modest, indicate that all images are positively correlated.
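Both metrics can be approximated in plain NumPy, as sketched below. Two simplifications relative to the published definitions apply: the rank correlation ignores ties (adequate for continuous reflectance values), and the SSIM is computed over a single global window, whereas mean SSIM as reported in Table 3 is conventionally computed over sliding local windows.

```python
import numpy as np

def spearman_rs(a, b):
    """Spearman rank correlation between two flattened band images.
    Ties-free simplification: ranks via double argsort, Pearson on ranks."""
    ra = np.argsort(np.argsort(a.ravel())).astype(float)
    rb = np.argsort(np.argsort(b.ravel())).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra**2).sum() * (rb**2).sum()))

def global_ssim(x, y, data_range=1.0):
    """Single-window SSIM (Wang et al. [66]) over a whole band; coarser
    than the sliding-window mean SSIM but uses the same formula."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = float(((x - mx) * (y - my)).mean())
    return float((2 * mx * my + c1) * (2 * cov + c2)
                 / ((mx**2 + my**2 + c1) * (vx + vy + c2)))
```

Because rs depends only on ranks, it is unaffected by the monotonic radiometric differences between the corrected JR image and the uncorrected simulations, which is precisely why it was chosen here.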
Furthermore, we performed Principal Component Analysis (PCA) on all three images to reduce dimensionality and simplify the comparison, allowing the majority of the information to be represented in the first few principal components [67] and mitigating the differences in data calibration and atmospheric correction. All principal component values are normalized before plotting. Figure 10 shows scattergrams of the first principal component (PC1) of JR plotted against PC1 of REF1 (Figure 10a) and REF2 (Figure 10b). R2 values approaching 1 for both plots indicate that the first principal components are linearly related; however, the steep slope and horizontal shearing of the clusters highlight greater spectral variability on the x-axis, a feature consistent with the stronger mixing in the JR image.
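A minimal sketch of this PCA-based comparison is given below, assuming min-max normalization of the PC scores (the specific normalization used in the study is not prescribed here, and the sign of a principal component is inherently arbitrary).

```python
import numpy as np

def first_pc(cube):
    """Scores of a (rows, cols, bands) cube on its first principal
    component, min-max normalized to [0, 1]."""
    X = cube.reshape(-1, cube.shape[-1]).astype(float)
    X -= X.mean(axis=0)                         # centre each band
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    pc1 = X @ vt[0]                             # project onto first component
    return (pc1 - pc1.min()) / (pc1.max() - pc1.min() + 1e-12)

def r_squared(x, y):
    """Coefficient of determination of the least-squares line y ~ x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return float(1.0 - (residuals**2).sum() / ((y - y.mean())**2).sum())
```

Plotting `first_pc(jr)` against `first_pc(ref1)` and fitting a line then reproduces the kind of comparison shown in Figure 10, with `r_squared` quantifying the linearity of the relationship.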
The design of two ground truth maps with differing complexities and resolutions highlights HySIMU’s ability to handle scenarios of varying complexity and scale. A realistic, high-resolution ground truth may not be necessary if the goal is only to test the response of various system parameters rather than to perform a full classification scheme, particularly since high-resolution ground truth requires longer computing times. This validation scheme demonstrates how HySIMU can be used to generate hyperspectral imagery from a set of realistic parameters and to evaluate the effects of changing parameters for an HRS mission. Real atmospheric effects, however, contain a high degree of uncertainty, particularly in the Blue bands (~450 nm) [68]; they are difficult to model appropriately and often lead to underestimation. To mitigate the propagation of these errors through the processing chain, the use of atmospheric correction algorithms other than 6S or libRadtran is recommended when inverse modelling HySIMU images.
An additional validation simulation (REF-A1), using the Urban dataset (UR) acquired with the Hyperspectral Digital Imagery Collection Experiment (HYDICE) instrument [70,71] together with the ground truth and endmember spectra derived by [69], is provided in the Supplementary File. Unlike the Jasper Ridge simulations, the Urban simulation does not include ground truths with different complexity levels. However, the analyses follow the same approach and order as those used for the Jasper Ridge simulations, and the results should therefore be interpreted in the same manner.
Previous iterations of HySIMU, implemented in MATLAB (R2020b) as a closed-source codebase and designed specifically for UAV applications without integrated RTMs, demonstrated practical utility in mineral exploration [27] and soil sensing [28]. For satellite applications, Beaulne et al. [29] used a newer version of HySIMU in simulation studies comparing the performance of the PRISMA and PACE-OCI (Plankton, Aerosol, Cloud, ocean Ecosystem–Ocean Color Instrument) satellite systems for algal bloom monitoring. Using real algal spectra from the GLORIA (GLObal Reflectance community dataset for Imaging and optical sensing of Aquatic environments) dataset [72,73], that study concluded that while simulated PRISMA imagery offered finer-scale detection, its performance metrics were not consistently better. This illustrates the benefit of HySIMU in giving users a tool to evaluate the effectiveness of various satellite HRS products for their specific applications. HySIMU can serve as a complementary toolkit to various open-source hyperspectral analysis and processing tools, e.g., [74,75], contributing to the development of a comprehensive and transparent HSI workflow. We also envision several future application areas for HySIMU, including the generation of synthetic datasets to train machine-learning (ML) algorithms for detecting atmospheric carbon from satellite imagery. In collaboration with an industry partner, we generated synthetic satellite hyperspectral images (using PRISMA satellite parameters) with varying levels of atmospheric methane in selected regions. These images were processed using libRadtran and provided to the partner’s in-house ML algorithm, with the objective of training the algorithm on synthetic data to enable its application to real satellite imagery. The results were promising, highlighting the potential for further development.

4. Conclusions

The increasing availability of hyperspectral remote sensing sensors and platforms creates a need for an open-source simulator that is readily accessible to the remote sensing research community and to small and medium-sized enterprises. In this study, HySIMU is presented as a framework for an open-source, modular, and sensor- and platform-independent HRS simulator toolkit that integrates a primary processing pipeline with various open-source packages into a validated and modular workflow. HySIMU can simulate data at any spatial and spectral resolution and scale. It gives users greater control over the parameters and thus provides a tool for sensitivity studies to determine the optimum survey parameters and conditions. The ability to produce reliable synthetic images can also benefit the training of ML algorithms for various applications.
The main challenge in developing HySIMU was the absence of high-resolution hyperspectral data and ground truth maps, which prevents the simulator from being used to its full extent and restricts its validation. Nevertheless, the performance of HySIMU was successfully validated, albeit in a simplified manner, by simulating the AVIRIS Jasper Ridge and HYDICE Urban reflectance images. The two simulations presented herein show that HySIMU can generate synthetic hyperspectral imagery at any scale and is applicable to sensitivity studies (for any single or multiple parameter choices) during mission planning.
It is important to note that real hyperspectral scenes often contain uncertainties from multiple sources, and an ideal simulator without simplifications is difficult to achieve. This toolkit implements several approximations; its effectiveness is therefore determined by the input data and the specific context of the application. Furthermore, to qualify as an end-to-end simulator, several other modules could be integrated in upcoming versions, e.g., camera calibration and electronic noise modules. Additionally, while the randomly generated maps produced by HySIMU are statistical representations rather than real-life maps, as more thematic or land classification remote sensing images become openly available, there is potential to add an ML-driven, realistic map generator function in the future.
Despite these limitations, the integration of many previously developed community tools makes HySIMU a flexible and fully customizable toolkit. Applications in other fields, such as vegetation mapping, agriculture, carbon detection, and environmental monitoring, are therefore feasible. HySIMU is a starting framework that can provide the remote sensing community and its users with a useful tool for specific applications, with a focus on mission design, target requirements, or sensor selection. HySIMU is developed with the community in mind, and users are encouraged to suggest improvements, add modules and algorithms, and apply the toolkit to new applications. The code is made available via GitHub under the GNU General Public License version 3 (GPLv3).

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/rs18060943/s1, Figure S1. Graphical representation of the HySIMU processing chain. All components and details correspond to those described in Figure 1. Blue text indicates the modules, red text indicates inputs, and green text indicates outputs. The images included are for visualization purposes only and do not represent actual simulation outputs. Table S1. HYDICE-based Urban (UR) hyperspectral image simulation parameters used in the validation study, modified from [70,71]. Table S2. Comparison metrics rs and mean SSIM between the Urban dataset (UR) and the HySIMU simulation REF-A1, after removal of spectral bands affected by dense water vapour and atmospheric effects (bands 1–4, 76, 87, 101–111, 136–153 and 198–210). Figure S2. (a) Spectral endmembers of the HYDICE 1995 Urban reflectance image (UR) classified by [69]. (b) The simplified ground truth map derived from the abundance map and the endmembers from the same study. (c) RGB image (307 × 307 pixels) of the Urban hyperspectral dataset with 2 m spatial resolution. (d) RGB image (307 × 307 pixels) of the at-sensor reflectance datacube REF-A1, simulated by HySIMU using (b) as the ground truth and the parameters listed in Table S1. Figure S3. NIR (near-infrared) versus Red scattergrams for (a) UR (HYDICE Urban) and (b) REF-A1. These scattergrams illustrate the degree of spectral mixing in the corresponding reflectance images, with UR exhibiting a stronger mixing. Figure S4. (a) Scattergrams of PC1 (the first principal component) values from UR (HYDICE Urban, x-axis) plotted against PC1 from REF-A1 (y-axis). (b) Scattergrams of PC2 (the second principal component) values from UR (x-axis) plotted against PC2 from REF-A1 (y-axis). All PC values are normalized for consistent scaling.

Author Contributions

Conceptualization, F.A. and A.B.; methodology, F.A.; software, F.A.; validation, F.A. and A.B.; formal analysis, F.A. and A.B.; investigation, F.A. and A.B.; resources, A.B.; data curation, F.A.; writing—original draft preparation, F.A.; writing—review and editing, F.A. and A.B.; visualization, F.A.; supervision, A.B.; project administration, A.B.; funding acquisition, A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by NSERC (Natural Sciences and Engineering Research Council of Canada) through the CREATE (Collaborative Research and Training Experience) UTILI (Uninhabited aircraft systems Training, Innovation and Leadership Initiative) program and MITACS (IT42788) through an Accelerate internship with Metaspectral.

Data Availability Statement

The code is available on GitHub: https://github.com/fadhli-atarita/HySIMU/ (accessed on 1 March 2026).

Acknowledgments

The authors would like to thank the NSERC CREATE UTILI program for funding the initial development of HySIMU; MITACS (IT42788) and Metaspectral for the subsequent funding that made it possible to improve HySIMU to the latest version; the Centre for Advanced Computing (CAC) at Queen’s University for providing the computing resources needed for this project; and the anonymous reviewers for their comments, which helped improve this manuscript. We would also like to acknowledge the many HRS experts who have developed tools and packages for hyperspectral forward modelling and simulations.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
6S: Simulation of a Satellite Signal in the Solar Spectrum
AVIRIS: Airborne Visible/InfraRed Imaging Spectrometer
CASI: Compact Airborne Spectrographic Imager
CHIMES: Cranfield Hyperspectral Image Modelling and Evaluation System
CRISM: Compact Reconnaissance Imaging Spectrometer for Mars
DEM: Digital Elevation Model
DFT: Discrete Fourier Transform
DIRSIG: Digital Imaging and Remote Sensing Image Generation
EnMAP: Environmental Mapping and Analysis Program
EO: Earth Observation
EeteS: EnMAP End-to-End Simulation Tool
FFT: Fast Fourier Transform
FASSP: Forecasting and Analysis of Spectroradiometric System Performance
GLORIA: GLObal Reflectance community dataset for Imaging and optical sensing of Aquatic environments
HPC: High-Performance Computing
HRS: Hyperspectral Remote Sensing
HSI: Hyperspectral Imaging
HYDICE: Hyperspectral Digital Imagery Collection Experiment
HySIMU: Hyperspectral SIMUlator
IDFT: Inverse Discrete Fourier Transform
JR: AVIRIS Jasper Ridge hyperspectral image
libRadtran: library for Radiative transfer
MaxRSS: Maximum Resident Set Size
MV-N: multivariate normal distribution
NIR: Near-infrared
PACE-OCI: Plankton, Aerosol, Cloud, ocean Ecosystem–Ocean Color Instrument
PC1: the first principal component
PC2: the second principal component
PRISMA: PRecursore IperSpettrale della Missione Applicativa
PSF: Point Spread Function
PST: Pacific Standard Time
RGB: Red-Green-Blue
RTM: Radiative Transfer Model
TOA: Top-Of-Atmosphere
UAV: Uncrewed Aerial Vehicle
UR: HYDICE Urban hyperspectral image
USGS: United States Geological Survey

References

  1. Transon, J.; d’Andrimont, R.; Maugnard, A.; Defourny, P. Survey of Hyperspectral Earth Observation Applications from Space in the Sentinel-2 Context. Remote Sens. 2018, 10, 157. [Google Scholar] [CrossRef]
  2. Jia, J.; Wang, Y.; Chen, J.; Guo, R.; Shu, R.; Wang, J. Status and Application of Advanced Airborne Hyperspectral Imaging Technology: A Review. Infrared Phys. Technol. 2020, 104, 103115. [Google Scholar] [CrossRef]
  3. Qian, S.-E. Hyperspectral Satellites, Evolution, and Development History. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7032–7056. [Google Scholar] [CrossRef]
  4. Qian, S. Overview of Hyperspectral Imaging Remote Sensing from Satellites. In Advances in Hyperspectral Image Processing Techniques; Chang, C.-I., Ed.; Wiley: Hoboken, NJ, USA, 2022; pp. 41–66. ISBN 978-1-119-68776-4. [Google Scholar]
  5. Bhargava, A.; Sachdeva, A.; Sharma, K.; Alsharif, M.H.; Uthansakul, P.; Uthansakul, M. Hyperspectral Imaging and Its Applications: A Review. Heliyon 2024, 10, e33208. [Google Scholar] [CrossRef] [PubMed]
  6. Pixxel Space Technologies Hyperspectral Imagery. Available online: https://www.pixxel.space/hyperspectral-imagery (accessed on 3 June 2025).
  7. Wyvern Inc. Hyperspectral Data Products. Available online: https://www.wyvern.space/product (accessed on 16 March 2026).
  8. Orbital Sidekick Technology. Available online: https://www.orbitalsidekick.com/technology (accessed on 1 March 2026).
  9. Zhang, Z.; Huang, L.; Wang, Q.; Jiang, L.; Qi, Y.; Wang, S.; Shen, T.; Tang, B.-H.; Gu, Y. UAV Hyperspectral Remote Sensing Image Classification: A Systematic Review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 3099–3124. [Google Scholar] [CrossRef]
  10. Yao, H.; Qin, R.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens. 2019, 11, 1443. [Google Scholar] [CrossRef]
  11. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef]
  12. Zhong, Y.; Wang, X.; Xu, Y.; Wang, S.; Jia, T.; Hu, X.; Zhao, J.; Wei, L.; Zhang, L. Mini-UAV-Borne Hyperspectral Remote Sensing: From Observation and Processing to Applications. IEEE Geosci. Remote Sens. Mag. 2018, 6, 46–62. [Google Scholar] [CrossRef]
  13. Engineering Village. Search Results for “Hyperspectral Remote Sensing”. Elsevier. Available online: https://www.engineeringvillage.com (accessed on 3 June 2025).
  14. Lv, Z.; Zhang, M.; Sun, W.; Lei, T.; Benediktsson, J.A.; Liu, T. Land Cover Change Detection with Hyperspectral Remote Sensing Images: A Survey. Inf. Fusion 2025, 123, 103257. [Google Scholar] [CrossRef]
  15. Lu, B.; Dao, P.; Liu, J.; He, Y.; Shang, J. Recent Advances of Hyperspectral Imaging Technology and Applications in Agriculture. Remote Sens. 2020, 12, 2659. [Google Scholar] [CrossRef]
  16. Kerekes, J.P.; Landgrebe, D.A. Simulation of Optical Remote Sensing Systems. IEEE Trans. Geosci. Remote Sens. 1989, 27, 762–771. [Google Scholar] [CrossRef]
  17. Börner, A.; Wiest, L.; Keller, P.; Reulke, R.; Richter, R.; Schaepman, M.; Schläpfer, D. SENSOR: A Tool for the Simulation of Hyperspectral Remote Sensing Systems. ISPRS J. Photogramm. Remote Sens. 2001, 55, 299–312. [Google Scholar] [CrossRef]
  18. Zahidi, U.A.; Yuen, P.W.T.; Piper, J.; Godfree, P.S. An End-to-End Hyperspectral Scene Simulator with Alternate Adjacency Effect Models and Its Comparison with CameoSim. Remote Sens. 2019, 12, 74. [Google Scholar] [CrossRef]
  19. Inamdar, D.; Kalacska, M.; Leblanc, G.; Arroyo-Mora, J.P. Characterizing and Mitigating Sensor Generated Spatial Correlations in Airborne Hyperspectral Imaging Data. Remote Sens. 2020, 12, 641. [Google Scholar] [CrossRef]
  20. Inamdar, D.; Kalacska, M.; Darko, P.O.; Arroyo-Mora, J.P.; Leblanc, G. Spatial Response Resampling (SR2): Accounting for the Spatial Point Spread Function in Hyperspectral Image Resampling. MethodsX 2023, 10, 101998. [Google Scholar] [CrossRef]
  21. Segl, K.; Guanter, L.; Rogass, C.; Kuester, T.; Roessner, S.; Kaufmann, H.; Sang, B.; Mogulsky, V.; Hofer, S. EeteS—The EnMAP End-to-End Simulation Tool. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 522–530. [Google Scholar] [CrossRef]
  22. Parente, M.; Clark, J.T.; Brown, A.J.; Bishop, J.L. End-to-End Simulation and Analytical Model of Remote-Sensing Systems: Application to CRISM. IEEE Trans. Geosci. Remote Sens. 2010, 48, 5491159. [Google Scholar] [CrossRef]
  23. Kerekes, J.P.; Baum, J.E. Full-Spectrum Spectral Imaging System Analytical Model. IEEE Trans. Geosci. Remote Sens. 2005, 43, 571–580. [Google Scholar] [CrossRef]
  24. Schott, J.R.; Brown, S.D.; Raqueño, R.V.; Gross, H.N.; Robinson, G. An Advanced Synthetic Image Generation Model and Its Application to Multi/Hyperspectral Algorithm Development. Can. J. Remote Sens. 1999, 25, 99–111. [Google Scholar] [CrossRef]
  25. Goodenough, A.A.; Brown, S.D. DIRSIG5: Next-Generation Remote Sensing Data and Image Simulation Framework. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 4818–4833. [Google Scholar] [CrossRef]
  26. Guanter, L.; Segl, K.; Kaufmann, H. Simulation of Optical Remote-Sensing Scenes With Application to the EnMAP Hyperspectral Mission. IEEE Trans. Geosci. Remote Sens. 2009, 47, 2340–2351. [Google Scholar] [CrossRef]
  27. Atarita, F.; Braun, A. Synthetic Hyperspectral Sensing Simulator: A Tool for Optimizing Applications in Mineral Exploration. In Proceedings of the SPIE Future Sensing Technologies 2021; Valenta, C.R., Shaw, J.A., Kimata, M., Eds.; SPIE: Bellingham, WA, USA, 2021; p. 35. [Google Scholar]
  28. Atarita, F.; Braun, A. HYSIMU: A Hyperspectral Simulator for Airborne Remote Sensing of Soils. In Proceedings of the Application of Proximal and Remote Sensing Technologies for Soil Investigations, Virtual, 16–19 August 2021. [Google Scholar]
  29. Beaulne, D.; Atarita, F.; Fotopoulos, G.; Braun, A. Simulating At-Sensor Hyperspectral Satellite Data for Inland Water Algal Blooms. Sci. Total Environ. 2025, 1000, 180313. [Google Scholar] [CrossRef]
  30. Strahler, A.H.; Woodcock, C.E.; Smith, J.A. On the Nature of Models in Remote Sensing. Remote Sens. Environ. 1986, 20, 121–139. [Google Scholar] [CrossRef]
  31. Bourke, P. Frequency Synthesis of Landscapes (and Clouds). Available online: https://paulbourke.net/fractals/noise/ (accessed on 3 June 2025).
  32. Wang, Y.; Azam, A.; Wilson, M.C.; Neville, A.; Morina, A. Generating Fractal Rough Surfaces with the Spectral Representation Method. In Proceedings of the Institution of Mechanical Engineers, Part J: Journal of Engineering Tribology; SAGE Publications: London, UK, 2021. [Google Scholar] [CrossRef]
  33. Müller, S.; Schüler, L.; Zech, A.; Heße, F. GSTools v1.3: A Toolbox for Geostatistical Modelling in Python. Geosci. Model Dev. 2022, 15, 3161–3182. [Google Scholar] [CrossRef]
  34. Chakrabarti, A.; Zickler, T. Statistics of Real-World Hyperspectral Images. In Proceedings of the CVPR 2011; IEEE: Colorado Springs, CO, USA, 2011; pp. 193–200. [Google Scholar]
  35. Meerdink, S.K.; Hook, S.J.; Roberts, D.A.; Abbott, E.A. The ECOSTRESS Spectral Library Version 1.0. Remote Sens. Environ. 2019, 230, 111196. [Google Scholar] [CrossRef]
  36. Boggs, T. Spectral Python [Python Package]. Available online: http://spectralpython.net (accessed on 16 March 2026).
  37. Schott, J.R.; Salvaggio, C.; Brown, S.D.; Rose, R.A. Incorporation of Texture in Multispectral Synthetic Image Generation Tools; Watkins, W.R., Clement, D., Eds.; SPIE: Bellingham, WA, USA, 1995; pp. 189–196. [Google Scholar]
  38. Manolakis, D.G.; Marden, D.; Kerekes, J.P.; Shaw, G.A. Statistics of Hyperspectral Imaging Data; Shen, S.S., Descour, M.R., Eds.; SPIE: Bellingham, WA, USA, 2001; pp. 308–316. [Google Scholar]
  39. Kokaly, R.F.; Clark, R.N.; Swayze, G.A.; Livo, K.E.; Hoefen, T.M.; Pearson, N.C.; Wise, R.A.; Benzel, W.; Lowers, H.A.; Driscoll, R.L.; et al. USGS Spectral Library Version 7; Data Series; U.S. Geological Survey: Reston, VA, USA, 2017; p. 68.
  40. Rouberol, B. Haversine [Python Package]. Available online: https://github.com/mapado/haversine (accessed on 3 June 2025).
  41. Anderson, K.S.; Hansen, C.W.; Holmgren, W.F.; Jensen, A.R.; Mikofski, M.A.; Driesse, A. Pvlib Python: 2023 Project Update. J. Open Source Softw. 2023, 8, 5994. [Google Scholar] [CrossRef]
  42. Holmgren, W.F.; Hansen, C.W.; Mikofski, M.A. Pvlib Python: A Python Package for Modeling Solar Energy Systems. J. Open Source Softw. 2018, 3, 884. [Google Scholar] [CrossRef]
  43. Corripio, J.G. Insolation [Python Package]. Available online: https://www.meteoexploration.com/insol/python/index.html (accessed on 16 March 2026).
  44. Duffie, J.A.; Beckman, W.A. Solar Engineering of Thermal Processes, 1st ed.; Wiley: Hoboken, NJ, USA, 2013; ISBN 978-0-470-87366-3. [Google Scholar]
  45. Vermote, E.F.; Tanre, D.; Deuze, J.L.; Herman, M.; Morcette, J.-J. Second Simulation of the Satellite Signal in the Solar Spectrum, 6S: An Overview. IEEE Trans. Geosci. Remote Sens. 1997, 35, 675–686. [Google Scholar] [CrossRef]
  46. Emde, C.; Buras-Schnell, R.; Kylling, A.; Mayer, B.; Gasteiger, J.; Hamann, U.; Kylling, J.; Richter, B.; Pause, C.; Dowling, T.; et al. The libRadtran Software Package for Radiative Transfer Calculations (Version 2.0.1). Geosci. Model Dev. 2016, 9, 1647–1672. [Google Scholar] [CrossRef]
  47. Mayer, B.; Kylling, A. Technical Note: The libRadtran Software Package for Radiative Transfer Calculations-Description and Examples of Use. Atmos. Chem. Phys. 2005, 5, 1855–1877. [Google Scholar] [CrossRef]
  48. Kotchenova, S.Y.; Vermote, E.F. Validation of a Vector Version of the 6S Radiative Transfer Code for Atmospheric Correction of Satellite Data. Part II. Homogeneous Lambertian and Anisotropic Surfaces. Appl. Opt. 2007, 46, 4455–4464. [Google Scholar] [CrossRef]
  49. Kotchenova, S.Y.; Vermote, E.F.; Matarrese, R.; Frank, J.; Klemm, J. Validation of a Vector Version of the 6S Radiative Transfer Code for Atmospheric Correction of Satellite Data. Part I: Path Radiance. Appl. Opt. 2006, 45, 6762–6774. [Google Scholar] [CrossRef]
  50. Kotchenova, S.Y.; Vermote, E.F.; Levy, R.; Lyapustin, A. Radiative Transfer Codes for Atmospheric Correction and Aerosol Retrieval: Intercomparison Study. Appl. Opt. 2008, 47, 2215–2226. [Google Scholar] [CrossRef]
  51. Evans, C. Statistical Comparison Between Various Atmospheric Correction Methods and the LibRadtran Package. Int. J. Remote Sens. 2002, 23, 2651–2671. [Google Scholar] [CrossRef]
  52. Obregón, M.A.; Serrano, A.; Costa, M.J.; Silva, A.M. Validation of libRadtran and SBDART Models under Different Aerosol Conditions. IOP Conf. Ser. Earth Environ. Sci. 2015, 28, 012010. [Google Scholar] [CrossRef]
  53. Govaerts, Y.; Nollet, Y.; Leroy, V. Radiative Transfer Model Comparison with Satellite Observations over CEOS Calibration Site Libya-4. Atmosphere 2022, 13, 1759. [Google Scholar] [CrossRef]
  54. Wilson, R.T. Py6S: A Python Interface to the 6S Radiative Transfer Model. Comput. Geosci. 2013, 51, 166–171. [Google Scholar] [CrossRef]
  55. Gryspeerdt, E. pyLRT [Python Package]. Available online: https://github.com/EdGrrr/pyLRT (accessed on 11 February 2025).
  56. Chander, G.; Markham, B.L.; Helder, D.L. Summary of Current Radiometric Calibration Coefficients for Landsat MSS, TM, ETM+, and EO-1 ALI Sensors. Remote Sens. Environ. 2009, 113, 893–903. [Google Scholar] [CrossRef]
  57. ASTM G173-03(2020); Tables for Reference Solar Spectral Irradiances: Direct Normal and Hemispherical on 37° Tilted Surface. G03 Committee ASTM International: West Conshohocken, PA, USA, 2008. [CrossRef]
  58. Kruse, F.A.; Lefkoff, A.B.; Boardman, J.W.; Heidebrecht, K.B.; Shapiro, A.T.; Barloon, P.J.; Goetz, A.F.H. The Spectral Image Processing System (SIPS)—Interactive Visualization and Analysis of Imaging Spectrometer Data. Remote Sens. Environ. 1993, 44, 145–163. [Google Scholar] [CrossRef]
  59. Schowengerdt, R.A. Remote Sensing, Models, and Methods for Image Processing, 3rd ed.; Academic Press: Burlington, MA, USA, 2007; ISBN 978-0-12-369407-2. [Google Scholar]
  60. Blonski, S.; Cao, C.; Gasser, J.; Ryan, R.; Zanoni, V.; Stanley, T. Satellite Hyperspectral Imaging Simulation. In Proceedings of the International Symposium on Spectral Sensing Research (ISSSR) 1999; US Corps of Engineers: Washington, DC, USA, 2000. [Google Scholar]
  61. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 4th ed.; Global Edition; Pearson: New York, NY, USA, 2017; ISBN 978-0-13-335672-4. [Google Scholar]
  62. Zhu, F.; Wang, Y.; Fan, B.; Meng, G.; Pan, C. Effective Spectral Unmixing via Robust Representation and Learning-Based Sparsity. arXiv 2014. [Google Scholar] [CrossRef]
  63. Zhu, F.; Wang, Y.; Fan, B.; Meng, G.; Xiang, S.; Pan, C. Spectral Unmixing via Data-Guided Sparsity. arXiv 2014. [Google Scholar] [CrossRef] [PubMed]
  64. Zhu, F.; Wang, Y.; Xiang, S.; Fan, B.; Pan, C. Structured Sparse Method for Hyperspectral Unmixing. ISPRS J. Photogramm. Remote Sens. 2014, 88, 101–118. [Google Scholar] [CrossRef]
  65. Schober, P.; Boer, C.; Schwarte, L.A. Correlation Coefficients: Appropriate Use and Interpretation. Anesth. Analg. 2018, 126, 1763–1768. [Google Scholar] [CrossRef] [PubMed]
  66. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
  67. Rodarmel, C.; Shan, J. Principal Component Analysis for Hyperspectral Image Classification. Surv. Land Inf. Sci. 2002, 62, 115–122. [Google Scholar]
  68. Pahlevan, N.; Mangin, A.; Balasubramanian, S.V.; Smith, B.; Alikas, K.; Arai, K.; Barbosa, C.; Bélanger, S.; Binding, C.; Bresciani, M.; et al. ACIX-Aqua: A Global Assessment of Atmospheric Correction Methods for Landsat-8 and Sentinel-2 over Lakes, Rivers, and Coastal Waters. Remote Sens. Environ. 2021, 258, 112366. [Google Scholar] [CrossRef]
  69. Zhu, F. Hyperspectral Unmixing: Ground Truth Labeling, Datasets, Benchmark Performances and Survey. arXiv 2017. [Google Scholar] [CrossRef]
  70. Ford, S.J.; Kalp, D.; McGlone, J.C.; McKeown, D.M., Jr. Preliminary Results on the Analysis of HYDICE Data for Information Fusion in Cartographic Feature Extraction. In Proceedings of the Integrating Photogrammetric Techniques with Scene Analysis and Machine Vision III; McKeown, D.M., Jr., McGlone, J.C., Jamet, O., Eds.; SPIE: Bellingham, WA, USA, 1997; Volume 3072, pp. 67–86. [Google Scholar]
  71. Mitchell, P.A. Hyperspectral Digital Imagery Collection Experiment (HYDICE). In Proceedings of the Geographic Information Systems, Photogrammetry, and Geological/Geophysical Remote Sensing; Lurie, J.B., Pearson, J.J., Zilioli, E., Eds.; SPIE: Bellingham, WA, USA, 1995; Volume 2587, pp. 70–95. [Google Scholar]
  72. Lehmann, M.K.; Gurlin, D.; Pahlevan, N.; Alikas, K.; Anstee, J.M.; Balasubramanian, S.V.; Barbosa, C.C.F.; Binding, C.; Bracher, A.; Bresciani, M.; et al. GLORIA-A Global Dataset of Remote Sensing Reflectance and Water Quality from Inland and Coastal Waters; Pangaea: Bremen, Germany, 2022; p. 72. [Google Scholar]
  73. Lehmann, M.K.; Gurlin, D.; Pahlevan, N.; Alikas, K.; Conroy, T.; Anstee, J.; Balasubramanian, S.V.; Barbosa, C.C.F.; Binding, C.; Bracher, A.; et al. GLORIA-A Globally Representative Hyperspectral in Situ Dataset for Optical Sensing of Water Quality. Sci. Data 2023, 10, 100. [Google Scholar] [CrossRef]
  74. Hanson, N.; Manke, P.; Birkholz, S.; Mühlbauer, M.; Heine, R.; Brandes, A. Cuvis.Ai: An Open-Source, Low-Code Software Ecosystem for Hyperspectral Processing and Classification. In 2024 14th Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS); IEEE: Piscataway, NJ, USA, 2024. [Google Scholar]
  75. Mao, Y.; Betters, C.H.; Evans, B.; Artlett, C.P.; Leon-Saval, S.G.; Garske, S.; Cairns, I.H.; Cocks, T.; Winter, R.; Dell, T. OpenHSI: A Complete Open-Source Hyperspectral Imaging Solution for Everyone. Remote Sens. 2022, 14, 2244. [Google Scholar] [CrossRef]
Figure 1. HySIMU processing chain flowchart. The asterisk symbol indicates the option of having either a randomized or a custom DEM. These modules are connected by a main script that takes the inputs and passes the outputs to each module linearly. A graphical representation of the workflow is provided in the Supplementary File.
Figure 2. Samples of spectral zone maps (400 × 600 pixels) randomly generated by a function included in HySIMU’s “Ground Truth Builder” module. The two maps differ in complexity, generated using a Hurst exponent of 1 for (a) and 1.5 for (b). Each map is discretized into five undefined spectral zones, differentiated by colour. The spectra for each zone can be assigned as desired.
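The Hurst-exponent control in Figure 2 can be illustrated with a short sketch. This is not HySIMU’s actual implementation; the function name `random_zone_map` and the spectral-synthesis approach are assumptions for illustration only: a random surface with a power-law spectrum is generated and then quantized into equal-area zones, with a larger exponent yielding a smoother, less complex map.

```python
import numpy as np

def random_zone_map(rows, cols, hurst=1.0, n_zones=5, seed=0):
    """Illustrative fractal zone-map generator (not HySIMU's actual code).

    Synthesizes a random surface whose Fourier amplitude falls off as
    f^-(hurst + 1), then quantizes it into n_zones discrete spectral zones.
    Larger Hurst exponents yield smoother, less complex maps.
    """
    rng = np.random.default_rng(seed)
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    f = np.sqrt(fy**2 + fx**2)
    f[0, 0] = 1.0  # avoid division by zero at the DC component
    # Random phases under a power-law amplitude envelope
    spectrum = (f ** -(hurst + 1.0)) * np.exp(2j * np.pi * rng.random((rows, cols)))
    surface = np.real(np.fft.ifft2(spectrum))
    # Quantize the surface into equal-area zones via percentile thresholds
    edges = np.percentile(surface, np.linspace(0, 100, n_zones + 1)[1:-1])
    return np.digitize(surface, edges)  # integer zone labels 0..n_zones-1

zones = random_zone_map(400, 600, hurst=1.0)
```

Because the zone boundaries are percentile-based, each of the five zones covers roughly the same area, matching the balanced appearance of the sample maps.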
Figure 3. Examples of synthetic spectra generated by HySIMU’s “Datacube Texturing” module. (a) Reflectance spectra of five arbitrary alunite samples taken from the USGS spectral library. (b) Five new spectra synthesized from the five USGS samples in (a). (c) Five new spectra synthesized only from the mean spectrum (black line in all panels) of the spectra in (b).
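One plausible way to synthesize new spectra from a single mean spectrum, as in Figure 3c, is to add band-correlated (smoothed) random perturbations so the variants keep a realistic spectral shape. The sketch below is an assumption for illustration, not HySIMU’s actual method; the function name, amplitude, and smoothing window are all hypothetical.

```python
import numpy as np

def synthesize_spectra(mean_spectrum, n=5, amplitude=0.02, smooth=15, seed=0):
    """Illustrative spectral synthesis (an assumption, not HySIMU's method):
    perturb a mean reflectance spectrum with smooth, band-correlated noise
    so each variant remains a spectrally plausible reflectance curve."""
    rng = np.random.default_rng(seed)
    kernel = np.ones(smooth) / smooth  # moving-average smoother across bands
    variants = []
    for _ in range(n):
        noise = rng.standard_normal(mean_spectrum.size)
        noise = np.convolve(noise, kernel, mode="same")  # correlate neighbours
        variants.append(np.clip(mean_spectrum + amplitude * noise, 0.0, 1.0))
    return np.asarray(variants)

mean = 0.3 + 0.1 * np.sin(np.linspace(0, 6, 224))  # toy 224-band spectrum
variants = synthesize_spectra(mean)
```

Smoothing the noise before adding it is the key step: uncorrelated per-band noise would look like sensor noise, whereas correlated perturbations mimic natural sample-to-sample variability.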
Figure 4. False-colour images of (a) a non-textured, and (b) a textured synthetic surface reflectance datacube generated from the same ground truth using HySIMU’s “Datacube texturing” module. The function adds “texture” or correlated noise to the map, imitating spatial noise in realistic optical images.
Figure 5. RGB images of at-sensor radiance datacubes, (a) without and (b) with HySIMU’s “Solar and view geometry computation” module. The module adds topographic effects to the datacube by calculating solar and view angles as inputs for the RTM module. The texture introduced in the previous module (Figure 4) is intentionally excluded from these images to isolate and clearly demonstrate the effects of solar and view geometry alone.
Figure 6. (a) RGB images of an at-sensor radiance datacube generated from identical surface reflectance ground truth and sensor parameters but using different integrated radiance computation modules in HySIMU (as labelled). (b) Radiance spectra of the selected point (red-white bullseye in (a)) for each corresponding image. SAM analysis of all three images produces an overall mean score of 6% after normalization, indicating highly correlated spectra.
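The SAM comparison in Figure 6 rests on the standard spectral angle mapper, which measures the angle between two spectra treated as vectors. A minimal implementation is shown below; how HySIMU normalizes the angle into the quoted percentage score is not specified here, so only the raw angle is computed.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM): the angle in radians between two spectra
    treated as vectors. 0 means identical spectral shape; SAM is insensitive
    to multiplicative (illumination) scaling of either spectrum."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))  # clip guards rounding
```

For example, a spectrum and a brightness-scaled copy of it have an angle of zero, which is why SAM is a common shape-based similarity measure for radiance comparisons.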
Figure 7. RGB images of at-sensor radiance datacubes generated from the same surface reflectance ground truth and sensor parameters using (a) the simplified PSF, and (b) the SR2 filter, along with a zoomed-in inset on each image to highlight the differences. These images demonstrate the different levels of “blurriness” produced by the two PSF options included within HySIMU’s “Spatial Resampling” module, which correspond to the different PSF sizes used by each function.
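Conceptually, a PSF option like the “simplified PSF” above amounts to convolving each spectral band with a small blurring kernel. The sketch below uses a Gaussian kernel and FFT-based circular convolution as a stand-in; the kernel size and sigma are illustrative assumptions, not HySIMU’s actual filter parameters.

```python
import numpy as np

def gaussian_psf(size=5, sigma=1.0):
    """Normalized 2-D Gaussian PSF kernel (size and sigma are illustrative)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()  # sum-to-one kernel preserves total radiance

def apply_psf(band, psf):
    """Blur one spectral band by FFT-based convolution. Edges are treated
    circularly for brevity; a production pipeline would handle borders
    more carefully."""
    pad = np.zeros_like(band)
    pr, pc = psf.shape
    pad[:pr, :pc] = psf
    # Shift the kernel so its centre sits at the origin before multiplying
    pad = np.roll(pad, (-(pr // 2), -(pc // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(band) * np.fft.fft2(pad)))

psf = gaussian_psf()
blurred = apply_psf(np.ones((16, 16)), psf)
```

A larger kernel (or larger sigma) spreads each ground element over more pixels, which is exactly the difference in blurriness visible between the two panels of Figure 7.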
Figure 8. (a) Spectral endmembers of the 100 × 100-pixel subset of the AVIRIS 1992 Jasper Ridge (JR) reflectance image (e) classified by [62,63,64] and (b) the abundance map derived from those endmembers in the same studies. (c) Ground Truth 1 (GT1), a 100 × 100-pixel simplified ground truth map derived from (b). (d) Ground Truth 2 (GT2), an upscaled 200 × 200-pixel ground truth map containing three additional spectral classes, all mixtures of the Tree and Soil endmembers in different ratios. (f) An RGB image of the at-sensor reflectance datacube REF1 simulated by HySIMU using GT1 (c) as the ground truth and AVIRIS-based parameters. (g) An RGB image of the at-sensor reflectance datacube REF2 simulated by HySIMU using GT2 (d) as the ground truth and the same AVIRIS-based parameters as (f).
Figure 9. NIR versus Red scattergrams for (a) JR, (b) REF1, and (c) REF2. These scattergrams illustrate the degree of spectral mixing in the corresponding reflectance images, with JR exhibiting the strongest mixing.
Figure 10. Scattergrams of PC1 values from JR (x-axis) plotted against PC1 from REF1 (a) and REF2 (b) (y-axis). All PC1 values are normalized for consistent scaling. The resulting R² values of 0.879 and 0.869 reflect strong linear relationships between the images.
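The PC1 comparison in Figure 10 can be reproduced in a few lines: extract the first principal component score image of each reflectance datacube, then compute the squared Pearson correlation between the two score images. This is a generic sketch, not the authors’ exact procedure; `first_pc` and `r_squared` are illustrative names.

```python
import numpy as np

def first_pc(cube):
    """Score image of the first principal component of a (rows, cols, bands)
    datacube, computed via an SVD of the mean-centred pixel-by-band matrix."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)
    X = X - X.mean(axis=0)                     # centre each band
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return (X @ vt[0]).reshape(rows, cols)     # project onto first PC

def r_squared(x, y):
    """Coefficient of determination of a simple linear fit: squared Pearson r."""
    return float(np.corrcoef(np.ravel(x), np.ravel(y))[0, 1] ** 2)

rng = np.random.default_rng(0)
cube = rng.random((8, 8, 4))                   # toy stand-in for a datacube
pc1 = first_pc(cube)
```

For two co-registered images of the same scene, PC1 captures the dominant shared brightness/albedo variation, so a high R² between PC1 score images indicates the simulation reproduces the real image’s principal spatial structure.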
Table 1. AVIRIS-based Jasper Ridge (JR) simulation parameters used in the validation study.
| Parameter | Value |
| --- | --- |
| Sensor | AVIRIS-based |
| Bands | 224 |
| Spectral range | 370–2500 nm |
| Flight altitude | 20,000 m |
| Spatial resolution | 20 m |
| Acquisition date | 2 September 1992 |
| Acquisition time | 12:00:00 PST |
| View azimuth angle | 90° |
| View zenith angle | 0° |
| Image output | Reflectance |
| RTM | libRadtran |
| Atmospheric profile | mid-latitude summer |
| Aerosol profile | rural |
| PSF option | simplified |
| Platform speed | 206 m/s |
| Sensor FOV | 36° |
| Sensor integration time | 0.087 s |
Table 2. HySIMU computing performance metrics for REF1 and REF2 simulations. The MaxRSS parameter represents the maximum resident set size and indicates the peak amount of memory used by any process during the simulation.
|  | REF1 | REF2 |
| --- | --- | --- |
| GT image size | 100 × 100 pixels | 200 × 200 pixels |
| Spectral bands | 224 | 224 |
| Processor | Intel® Xeon® Processor E7-8867 v3 @ 2.5 GHz | Intel® Xeon® Processor E7-8867 v3 @ 2.5 GHz |
| Cores | 64 | 64 |
| Runtime (hh:mm:ss) | 01:02:59 | 03:57:12 |
| CPU time (d-hh:mm:ss) | 2-19:10:56 | 10-16:32:00 |
| MaxRSS | 35.36 GB | 35.97 GB |
Table 3. Comparison metrics rs and mean SSIM between JR and REF1 and REF2.
| Metric | JR–REF1 | JR–REF2 |
| --- | --- | --- |
| rs | 0.740 | 0.741 |
| Mean SSIM | 0.460 | 0.404 |
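If rs denotes Spearman’s rank correlation, as the symbol conventionally does, the Table 3 values can be computed as the Pearson correlation of the rank-transformed pixel values. The minimal NumPy version below breaks ties by order of appearance for brevity; in practice `scipy.stats.spearmanr` (which assigns average ranks to ties) would be used, and SSIM would come from a library such as scikit-image.

```python
import numpy as np

def spearman_rs(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks.
    Ties are broken by order of appearance here; scipy.stats.spearmanr
    handles them properly with average ranks."""
    rx = np.argsort(np.argsort(np.ravel(x)))   # rank of each element of x
    ry = np.argsort(np.argsort(np.ravel(y)))   # rank of each element of y
    return float(np.corrcoef(rx, ry)[0, 1])
```

Because it operates on ranks, rs rewards any monotonic agreement between the real and simulated images, which makes it a more forgiving complement to the structure-sensitive SSIM.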
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Atarita, F.; Braun, A. HySIMU: An Open-Source Toolkit for Hyperspectral Remote Sensing Forward Modelling. Remote Sens. 2026, 18, 943. https://doi.org/10.3390/rs18060943
