Article

A Multispectral Camera Suite for the Observation of Earth’s Outgoing Radiative Energy

Steven Dewitte, Al Ameen Abdul Nazar, Yuan Zhang and Lien Smeesters
1 Royal Observatory of Belgium, Avenue Circulaire 3, 1180 Brussels, Belgium
2 Brussels Photonics (B-PHOT), Applied Physics and Photonics Department, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
3 Key Laboratory of Thermo-Fluid Science and Engineering, Ministry of Education, School of Energy and Power Engineering, Xi’an Jiaotong University, Xi’an 710049, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Remote Sens. 2023, 15(23), 5487; https://doi.org/10.3390/rs15235487
Submission received: 19 October 2023 / Revised: 17 November 2023 / Accepted: 22 November 2023 / Published: 24 November 2023
(This article belongs to the Special Issue Earth Radiation Budget and Earth Energy Imbalance)

Abstract: As part of the Earth Climate Observatory space mission concept for the direct observation from space of the Earth Energy Imbalance, we propose an advanced camera suite for the high-resolution observation of the Total Outgoing Radiation of the Earth. For the observation of the Reflected Solar Radiation, we propose the use of two multispectral cameras covering the range from 400 to 950 nm, with a nadir resolution of 1.7 km, combined with a high-resolution RGB camera with a nadir resolution of 0.57 km. For the observation of the Outgoing Longwave Radiation, we propose the use of six microbolometer cameras, each with a spectral bandwidth of 1 μm in the range from 8 to 14 μm and a nadir resolution of 2.2 km.

1. Introduction

The Earth Energy Imbalance (EEI) is defined as the small difference between the incoming energy received by the Earth from the Sun and the outgoing energy lost by the Earth to space. Both the incoming solar and the terrestrial outgoing energy are of the order of 340 W/m2 at the global annual mean level [1], while the EEI is of the order of 0.9 W/m2 [2,3]. The EEI is accumulated in the Earth’s climate system, particularly in the oceans which have a high heat capacity, and results in global temperature rise.
The monitoring of the EEI is of prime importance for a predictive understanding of climate change [4,5].
Despite its fundamental importance, the EEI is currently poorly measured from space, due to two fundamental challenges.
The first fundamental challenge is that the EEI is the relatively small difference between two opposite terms with large and nearly equal amplitude. Currently, incoming solar radiation and outgoing terrestrial radiation are measured with separate instruments, which means that their calibration errors are added, and overwhelm the signal to be measured. The current error on the direct measurement of the EEI is of the order of 5 W/m2 [1], significantly larger than the signal to be measured of the order of 0.9 W/m2. In order to make significant progress in this challenge, a differential measurement using identically designed, intercalibrated instruments—so-called wide-field-of-view (WFOV) radiometers—to measure both the incoming solar radiation and the outgoing terrestrial radiation is needed [6].
The second fundamental challenge is that outgoing terrestrial radiation has a systematic diurnal cycle [7,8,9]. From 2003 to 2023, the diurnal cycle of the outgoing terrestrial radiation was sampled from the so-called morning and afternoon Sun synchronous orbits, complemented by geostationary imagers [10,11]; recently, the sampling from the morning orbit had to be abandoned [12]. The Geostationary Earth Radiation Budget Experiment [13] resolves the detailed diurnal variations in the ERB but lacks global coverage for the connection to the EEI. The sampling of the global diurnal cycle can be improved by using two orthogonal 90° inclined orbits [14] which provide both global coverage and a statistical sampling of the full diurnal cycle at the seasonal (3-month) time scale. For calculating the global annual mean EEI, the satellite measurements are first averaged temporally per latitude/longitude gridbox; next, the gridbox yearly averages are averaged spatially with an appropriate weighting factor to obtain the global annual mean. Details are provided in Reference [14].
The wide-field-of-view radiometer will make accurate low-spatial-resolution measurements of the Total Outgoing Radiation (TOR) of the Earth. Auxiliary visible and thermal imagers will be used to increase the spatial resolution of the radiometer observations and to separate the TOR spectrally into the Reflected Solar Radiation (RSR) and the Outgoing Longwave Radiation (OLR).
The narrowband multispectral imagery from the cameras can be combined, using, e.g., linear regression techniques, to estimate the broadband radiance: either the OLR radiance from the thermal cameras or the RSR radiance from the visible cameras. This narrowband-to-broadband conversion principle has a long heritage from missions such as AVHRR [15], HIRS [16], MODIS [17], and SEVIRI [18].
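As an illustration of this narrowband-to-broadband principle, the minimal sketch below fits an ordinary least-squares regression from simulated narrowband radiances to a broadband radiance. All scene values, band counts, and coefficients are placeholders for illustration only; in practice, the training pairs would come from radiative transfer simulations or collocated broadband observations.

```python
import numpy as np

# Minimal narrowband-to-broadband regression sketch (illustrative only).
# Rows of X are simulated narrowband radiances for a set of training scenes;
# y holds the corresponding broadband (e.g., RSR) radiances.
rng = np.random.default_rng(0)
n_scenes, n_bands = 200, 16          # e.g., the 4 x 4 VIS filter bands
X = rng.uniform(0.0, 100.0, size=(n_scenes, n_bands))   # narrowband radiances [W m-2 sr-1]
true_w = rng.uniform(0.5, 1.5, size=n_bands)
y = X @ true_w + rng.normal(0.0, 1.0, size=n_scenes)    # synthetic broadband radiance

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(n_scenes), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Apply the regression to a new narrowband measurement vector.
x_new = rng.uniform(0.0, 100.0, size=n_bands)
broadband_estimate = coef[0] + x_new @ coef[1:]
print(f"estimated broadband radiance: {broadband_estimate:.1f} W m-2 sr-1")
```

More elaborate estimators, such as the deep learning approaches mentioned below, would replace the linear model while keeping the same input-output structure.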
After the narrowband-to-broadband conversion, the cameras measure the angular distribution of the OLR and RSR radiance, whose total is measured accurately by the radiometer. This radiance distribution can be used, e.g., for correcting small imperfections in the cosine response of the radiometer. In turn, the radiometer can be used for a bias correction of the camera-derived broadband radiances.
With the bias-corrected directional radiance distribution, it is possible to derive additional secondary objective data products, e.g., maps of fluxes. This is routinely performed in several satellite products, for instance, CERES [19] and GERB [20], and is achieved through the development of Angular Dependency Models (ADMs) [21]. Such ADMs can be derived from the multiangular views of multispectral cameras. Beyond these traditional approaches to mapping radiance to flux, and given the intended 2036 launch time frame, it may be interesting to develop new algorithms based on deep learning (DL) [22,23], which is currently seeing many applications in the remote sensing domain.
Figure 1 shows a multiannual mean TOR at 1° resolution, Figure 2 shows the corresponding RSR, and Figure 3 shows the corresponding OLR.
A reference design of the visible imager is described in [24].
A reference design of the thermal imager is described in [25].
A new space mission concept, called the Earth Climate Observatory (ECO), for the accurate and stable monitoring of the EEI is currently elaborated upon, building further on the basic concepts in [6,24,25].
In this paper, we describe an advanced multispectral camera suite with performance that goes beyond [24,25]’s reference designs in terms of spectral information content. In Section 2, we describe the advanced shortwave camera suite. In Section 3, we describe the optical design for a new visible high-resolution camera used in the advanced shortwave camera suite. In Section 4, we develop the advanced longwave camera suite. We discuss our results in Section 5 and provide our conclusions in Section 6.

2. Shortwave Camera Suite

In [24]’s reference design, the shortwave (SW) camera—also called the solar camera—consists of an RGB CMOS detector array with a wide-field-of-view lens allowing us to view the Earth from limb to limb. The field of view is 140°. The wide-field-of-view (WFOV) lens design is illustrated in Figure 4. The nadir resolution is 2.2 km. The f-number is 3. The RGB CMOS spectral responses used are illustrated in Figure 5. On a stand-alone basis, the RSR can be estimated from an RGB CMOS imager through spectral regression with a noise level of 3%.
The only on-board calibration source for the SW camera suite will be a shutter, which allows us to verify the dark current and to shield the camera from direct solar illumination during solar pointing. The camera spectral gains and dark current will be determined preflight during ground calibration. The stability of the RGB spectral responses and of the gains will be monitored and, if needed, adjusted in flight through vicarious calibration, monitoring stable Earth targets similarly to [26,27,28]. The stability of the shortwave cameras does not influence the accuracy of the global annual mean EEI, which depends on the radiometer.
The spectral regression noise level, through which the RSR broadband radiance can be estimated, can be improved by increasing the number of narrowband spectral channels used as input for the regression. In [29], commercially available CMOS sensors with 4 × 4 filter bands in the VIS (470–620 nm) and with 5 × 5 filter bands in the VIS-NIR (650–975 nm) are described.
The newly proposed ECO Shortwave Camera Suite (SCS) consists of three separate shortwave (SW) cameras. Cameras 2 and 3 can be based on the same CMOSIS CMV2000 sensor array. Since the size of the detector is comparable to that of the detector used in [24], the reference wide-field-of-view lens from Figure 4 can be reused. Camera 1 can be realized using the higher-resolution CMOSIS CMV12000 sensor array. Since the size of the CMV12000 sensor is about three times larger than the detector size used in [24], a new lens design is appropriate and will be presented in Section 3. The three shortwave cameras have different spectral responses, by using different filters directly integrated on the CMOS chip, with spatial patterns illustrated in Figure 6.
The three ECO SW cameras are the following:
1.
ECO SW Camera 1 will be the RGB high-resolution SW camera, formed of a CMOS sensor, with spectral response between 400 and 1000 nm, with standard ‘RGGB’ Bayer pattern. The active area of the CMOS sensor will be 3000 × 3000 pixels, yielding a nadir resolution for a single pixel of 0.57 km and a nadir resolution for the 2 × 2 RGGB pixels of 1.1 km. The pixel pattern of SW Camera 1 is illustrated in the left part of Figure 6.
2.
ECO SW Camera 2 will be the VIS SW camera from [29]. The nadir resolution of the 4 × 4 VIS filter pattern—illustrated in the middle part of Figure 6—is 6.8 km.
3.
ECO SW Camera 3 will be the VIS-NIR SW camera from [29], using the same underlying CMOS sensor as Camera 2. The nadir resolution of the 5 × 5 VIS-NIR filter pattern—illustrated in the right part of Figure 6—is 8.5 km.
The detectors from [29], considered for Cameras 2 and 3, are also considered for the CubeMAP [30] and Hyperscout-H [31] space missions.

3. Optical Design of the High-Resolution SW Camera

The SW camera uses the CMOSIS CMV12000 4K CMOS detector with a pixel size of 5.5 μm. For the envisioned FOV of 140°, we use a circular area of 16.5 mm diameter, corresponding to 3000 pixels. For a WFOV lens with a field angle of 140° observing the Earth from an altitude of 700 km, the size of the nadir pixel is 0.57 km. Considering that the sensor is used with a Bayer color filter, the required RMS spot size should be smaller than 2 pixels, corresponding to 11 μm. The barrel distortion is considered to be corrected in postprocessing.
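The quoted 0.57 km nadir pixel can be reproduced with a simple back-of-the-envelope calculation, assuming that each of the 3000 pixels across the 140° field subtends an equal angle (an f-theta-like mapping); this is only a sanity check, not the full optical model.

```python
import math

# Nadir ground sampling distance (GSD), assuming equal angular sampling per pixel.
altitude_km = 700.0
fov_deg = 140.0
n_pixels_across = 3000

angle_per_pixel_rad = math.radians(fov_deg) / n_pixels_across
nadir_gsd_km = altitude_km * angle_per_pixel_rad
print(f"nadir GSD ≈ {nadir_gsd_km:.2f} km")   # ≈ 0.57 km
```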

3.1. The Design Parameters

The lens train—illustrated in Figure 7—consists of six lenses, two of which form an achromatic doublet. The doublet, together with the third lens, corrects the chromatic aberration and focuses the image on the image plane. The main lens properties are summarized in Table 1. The lens materials were selected from the SCHOTT® catalog. LAK14 is a crown glass with a relatively high refractive index, and SF6 is a flint glass, also with a high refractive index.
There are three aspherical surfaces in the system: the first surface of the system, the last surface of the system, and the final surface of the achromatic doublet. These surfaces increase the light collection and reduce the spherical aberrations. The initial reference design for this lens train was the one shown in Figure 4 [24].
The system was designed and optimized using the Zemax 2018 OpticStudio® software. Only a few parameters were constrained, namely the effective focal length, set to 15.6 mm, and the minimum lens thickness, set to 2 mm. The initial spot-size merit function was set up through the ‘Quick focus’ function provided by Zemax. Once a substantial fraction of the criteria was fulfilled during optimization, with satisfactory RMS values and a good relative illumination, further optimization was performed using the ‘Hammer optimization’ function provided by Zemax.

3.2. Optical Performance

3.2.1. RMS Error

The RMS error—see Figure 8—is the root mean square deviation of a bundle of rays from its intended point of incidence. To preserve the spatial information captured by the detector at a resolution of 2 pixels, corresponding to a nadir resolution of 1.1 km, this value should stay below approximately 11 μm. This holds over nearly the entire field for the proposed lens train; the RMS error only reaches 12 μm for the 400 nm wavelength at the extreme 70° incidence angle.

3.2.2. Seidel Aberrations

The Seidel aberrations—see Figure 9—have been optimized and kept low, apart from the distortion. The barrel distortion will be corrected on the ground in postprocessing.

3.2.3. Point Spread Function (PSF)

The optical PSF characterizes how a point of light is transformed when passing through an optical system. The optical PSF is related to the RMS spot size shown in Figure 8, which mostly lies below the desired 11 μm. The average optical PSF over all wavelengths is given in Figure 10.
To evaluate the joint effect of the optical PSF and the pixel shape, the convolution of the optical PSF with the detector pixel shape is calculated. We use a layer of fixed size containing several pixels of the same type, either red/blue active pixels or green-green active pixels. The two types of pixels exist due to the presence of the Bayer filter. The individual pixels have sizes of 5.5 μm, but to emulate real pixels, an active area of 80% of the nominal pixel size, as shown in Figure 11, is used. We assume the variation in the optical PSF is negligible within the layer of pixels.
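The joint effect of the optical PSF and the pixel active area can be emulated with a simple two-dimensional convolution. The sketch below uses a synthetic Gaussian placeholder PSF and an 80% fill-factor checkerboard of 5.5 μm pixels; the PSF width and grid size are illustrative assumptions, and the actual analysis uses the PSF exported from the optical design software.

```python
import numpy as np
from scipy.signal import fftconvolve

# Convolution of a placeholder optical PSF with a Bayer-type active-pixel layer.
window_um = 53.236                  # detector window, as in Figure 10
n = 256                             # grid samples across the window
x = (np.arange(n) - n / 2) * (window_um / n)
xx, yy = np.meshgrid(x, x)

sigma_um = 3.0                      # placeholder PSF width, not a design value
psf = np.exp(-(xx**2 + yy**2) / (2 * sigma_um**2))
psf /= psf.sum()

# Active-area mask: 5.5 um pitch, square active area covering 80% of each pixel,
# keeping every other pixel site in a checkerboard (e.g., the green-green layer).
pitch_um, fill = 5.5, 0.80
half = 0.5 * pitch_um * np.sqrt(fill)
px = np.abs(((xx + 0.5 * pitch_um) % pitch_um) - 0.5 * pitch_um) < half
py = np.abs(((yy + 0.5 * pitch_um) % pitch_um) - 0.5 * pitch_um) < half
checker = (np.floor(xx / pitch_um) + np.floor(yy / pitch_um)).astype(int) % 2 == 0
mask = px & py & checker

blurred = fftconvolve(mask.astype(float), psf, mode="same")
print(blurred.shape, round(float(blurred.max()), 3))
```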
The convolution of these pixel layers with the optical PSF for the 0° incidence angle is shown in Figure 12. It can be seen that, after the convolution, individual pixels are still recognizable.
From the RMS error in Figure 8, the 400 nm wavelength at a 70° incidence angle and the 500 nm wavelength at a 38° incidence angle have high RMS spot radii. The PSFs and convolutions at these particular wavelengths and angles are examined in detail.
The optical PSF for light with a wavelength of 400 nm incident at 70° is shown in the left of Figure 13. The convolution of this optical PSF with the blue-type active pixel layer is shown in the right of Figure 13.
The optical PSF for light with a wavelength of 500 nm incident at 38° is shown in the left of Figure 14. The convolution of this optical PSF with the green-type active pixel layer is given in the right of Figure 14.
The light spreads towards the top and bottom of the pixels in the right part of Figure 13 and Figure 14. Following [24], the required minimum spatial effective resolution of the cameras is 5 km. The current design meets this requirement with ample margin.

3.3. Simulated Performance for Earth Observation

3.3.1. SNR of the High-Resolution 4K RGB Camera

For Camera 1, with a nadir resolution of 570 m and a nominal satellite altitude of 700 km, the Lambertian Earth-leaving flux is attenuated towards the entrance of the camera by a factor of (0.57/700)²/π = 2.11 × 10⁻⁷. Within the optical system, with a 0° radiometric aperture diameter of 5.38 mm and a detector pixel size of 5.5 μm, the entrance flux is amplified towards the detector by a factor of π/4 × (5.38/(5.5 × 10⁻³))² = 7.5 × 10⁵. Assuming no light loss, the ratio between the detector flux and the Earth flux is then 2.11 × 10⁻⁷ × 7.5 × 10⁵ = 0.16. Assuming a 50% light loss, the ratio becomes 0.16 × 0.5 = 0.08.
The detector for Camera 1 is the CMOSIS CMV12000 CMOS detector array, with 4096 × 3072 pixels and a full well capacity of 1485 counts, corresponding to approximately 10.5 bit per pixel. For a 3000 × 3000 pixel subarray, it can be read out at a maximum frame rate of 400 fps, corresponding to a minimum readout time for a single frame of 2.5 ms. We calculate the maximum integration time before saturation occurs for the theoretical case of 100% diffuse reflection of an incident solar irradiance of 1500 W/m2. The wavelength-dependent saturation times for the red, green, and blue channels, with spectral response peaks at 634 nm, 547 nm, and 453 nm, are 16.7 μs, 18.9 μs, and 26 μs, respectively. An average quantum efficiency of 0.45 is assumed.
Before motion blurring occurs, a further time averaging is allowed within a time period of 0.57 km/(7 km/s) = 81.4 ms, so over 81.4 ms/2.5 ms ≈ 32 frames. The SNR for 32 integrations of 15 μs each is 47 × 10³, 41 × 10³, and 30 × 10³ for the red, green, and blue channels, respectively. If we follow the procedure of [24] to estimate the RSR from the RGB camera measurements, the quantization errors corresponding to these SNR values cause an error on the RSR of 0.04 W/m2. This 0.04 W/m2 quantization error is small compared to [24]’s spectral regression error of 3%, which for the reference scene of 1500 W/m2 corresponds to 3% × 1500 W/m2 = 45 W/m2.
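The geometric flux-ratio and frame-budget arithmetic above can be reproduced with a few lines; the sketch below uses the rounded values quoted in the text for Camera 1, and the same recipe applies to Cameras 2 and 3 with their respective aperture and resolution values.

```python
import math

# Geometric flux ratio and frame budget for Camera 1 (rounded as in the text).
nadir_res_km = 0.57       # nadir pixel size
altitude_km  = 700.0
aperture_mm  = 5.38       # 0-degree radiometric aperture diameter
pixel_um     = 5.5
light_loss   = 0.5        # assumed 50% light loss in the optics

attenuation   = (nadir_res_km / altitude_km) ** 2 / math.pi            # ~2.11e-7
amplification = math.pi / 4 * (aperture_mm / (pixel_um * 1e-3)) ** 2   # ~7.5e5
flux_ratio    = attenuation * amplification * light_loss               # ~0.08
print(f"detector/Earth flux ratio ≈ {flux_ratio:.2f}")

# Frame budget before motion blur: ~7 km/s ground speed, 2.5 ms frame readout.
dwell_s  = nadir_res_km / 7.0                                          # ~81 ms
n_frames = int(dwell_s / 2.5e-3)                                       # ~32 frames
print(f"dwell time ≈ {dwell_s * 1e3:.1f} ms, co-added frames ≈ {n_frames}")
```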

3.3.2. SNR of the Multispectral Cameras

For Cameras 2 and 3, with a nadir resolution of 1.71 km and a nominal satellite altitude of 700 km, the Lambertian Earth-leaving flux is attenuated towards the entrance of the camera by a factor of (1.71/700)²/π = 1.9 × 10⁻⁶. Within the optical system, with a 0° radiometric aperture diameter of 1.14 mm and a detector pixel size of 5.5 μm, the entrance flux is amplified towards the detector by a factor of π/4 × (1.14/(5.5 × 10⁻³))² = 3.37 × 10⁴. The ratio between the detector flux and the Earth flux is then 1.9 × 10⁻⁶ × 3.37 × 10⁴ = 0.064 without light loss and 0.064 × 0.5 = 0.032 with a 50% light loss.
The detector for Cameras 2 and 3 is the CMOSIS CMV2000 CMOS detector array, with 1024 × 2048 pixels and a full well capacity of 1012.5 counts, corresponding to approximately 10 bit per pixel. For a 1000 × 1000 pixel subarray, it can be read out at a maximum frame rate of 680 fps, corresponding to a minimum readout time for a single frame of 1.47 ms. Before motion blurring occurs, a further time averaging is allowed within a time period of 1.71 km/(7 km/s) = 243 ms, so over 243 ms/1.47 ms ≈ 166 frames. The Signal-to-Noise Ratio (SNR) as a function of wavelength for 166 readouts of 400 μs each is illustrated in Figure 15. For the least sensitive wavelength at 950 nm, the SNR for a single readout is 771 and the SNR for 166 readouts is 128 × 10³.

4. Longwave Camera Suite

4.1. Multispectral Thermal Cameras

A reference design of the thermal imager is described in [25]. The thermal imager consists of a microbolometer detector array with a WFOV lens allowing us to view the Earth from limb to limb. The WFOV lens design is illustrated in Figure 16. The nadir sampling distance is 2.2 km, and the nadir effective resolution is 4.4 km. Using a single wavelength band of 8–14 μm, the OLR can be estimated from the thermal imager on a stand-alone basis with an accuracy of 5%.
Longwave (LW) cameras are equipped with a shutter with known temperature and known high emissivity. The shutter is made of aluminum covered with Nextel 811-21 black paint [32]. This shutter will act as a flat black body. Combined with the deep space view, a two-point in-flight calibration for the longwave cameras will be implemented. The shutter view will be used for the characterization of the thermal offset as a function of the lens temperature. The deep space view will be used for an in-flight verification of the gain, which will also be measured on the ground before flight.
For improving the spectral regression OLR noise level, the number of ECO LW channels can be increased. This will be achieved by using multiple cameras, with filters inserted in front of the microbolometer detector array. In [33], the TIRI/HERA space instrument is described, using the same detector as in [25] and a filter wheel with six filters, each having a bandwidth of approximately 1 μm. For ECO, we will adopt similar filters, listed in Table 2.
The six ECO filter spectral responses are also similar to the filter responses used for the SEVIRI/MSG [34] and FCI/MTG [35] instruments.
The ECO LW Camera Suite (LCS) will be formed of six separate LW cameras, each consisting of the same microbolometer array and WFOV lens and each equipped with a different filter realizing the spectral responses from Table 2. For the microbolometer array, the Lynred 1024 × 768 array with 17 μm pixel pitch [36] can be used.
The LW spectral responses are illustrated in Figure 17.

4.2. Noise-Equivalent Differential Temperature (NEDT)

For thermal cameras with a nadir sampling distance of 2 km and a nominal satellite altitude of 700 km, the Lambertian Earth-leaving flux is attenuated towards the entrance of the camera by a factor of (2/700)² = 8.2 × 10⁻⁶. Within the optical system, with a 0° radiometric aperture diameter of 6.61 mm and a detector pixel size of 17 μm, the entrance flux is amplified towards the detector by a factor of π/4 × (6.61/(17 × 10⁻³))² = 119 × 10³. Assuming no light loss, the ratio between the detector flux and the Earth flux is then 8.2 × 10⁻⁶ × 119 × 10³ = 0.97. Assuming a 50% light loss, the ratio becomes 0.97 × 0.5 = 0.49.
The Lynred 1024 × 768 microbolometer detector array has an NEDT of 50 mK for f/1 optics at 300 K and at 30 Hz. For the detector-to-Earth flux ratio of 0.49 calculated above, the NEDT is increased to 50 mK/√0.49 = 72 mK. Before motion blurring occurs, the microbolometer signal can be integrated during 4.4 km/(7 km/s) = 0.63 s. After 0.63 s of averaging, the NEDT is reduced to 72 mK/√(0.63 s × 30 Hz) = 17 mK for a hypothetical microbolometer without a spectral filter. From the ECO spectral filters—illustrated in Figure 17 and quantified via Equation (3)—the fractions of the radiation of an ideal black body at 300 K measured in each band are calculated, and the resulting NEDTs per band are listed in Table 3.
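A minimal sketch of this NEDT budget is given below; it assumes, consistently with the numbers in the text and in Table 3, that the NEDT scales with the inverse square root of the flux ratio, of the number of averaged frames, and of the per-band black-body fraction. Differences of a few hundredths of a millikelvin with respect to Table 3 come from rounding.

```python
import math

# NEDT budget for the LW cameras (values as in the text and Table 3).
nedt_detector_mK = 50.0          # Lynred array, f/1 optics, 300 K, 30 Hz
flux_ratio       = 0.97 * 0.5    # detector/Earth flux ratio with 50% light loss
dwell_s          = 4.4 / 7.0     # 4.4 km effective resolution / 7 km/s ground speed
frame_rate_hz    = 30.0

nedt_optics_mK = nedt_detector_mK / math.sqrt(flux_ratio)               # ~72 mK
nedt_avg_mK    = nedt_optics_mK / math.sqrt(dwell_s * frame_rate_hz)    # ~17 mK (rounded in the text)

band_fractions   = [0.1087, 0.1155, 0.1083, 0.0908, 0.0643, 0.0362]     # Table 3
nedt_per_band_mK = [17.0 / math.sqrt(f) for f in band_fractions]        # reproduces Table 3 from the rounded 17 mK
print(round(nedt_optics_mK), round(nedt_avg_mK, 1))
print([round(v, 2) for v in nedt_per_band_mK])
```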

4.3. Longwave Spectral Regression

In [25], the OLR was estimated from a single narrowband irradiance measurement $I_{nb}$ via the following steps:
1. Conversion of the narrowband irradiance $I_{nb}$ to a narrowband brightness temperature $T_{nb}$.
2. Conversion of the narrowband brightness temperature $T_{nb}$ to a broadband brightness temperature $T_{bb}$.
3. Conversion of the broadband brightness temperature $T_{bb}$ to the OLR.
Here, we will generalize this approach using six narrowband irradiances $I_{nb,i}$, for i = 1, …, 6. The six narrowband irradiances are obtained from the six thermal cameras with the spectral responses illustrated in Figure 17. Using the sigmoid function $s(x)$, with
$$s(x) = \frac{1}{1 + e^{-x}}$$ (1)
we define the synthetic microbolometer spectral response $f_{mb}(\lambda)$ as
$$f_{mb}(\lambda) = s\big( (\lambda - 7.5\ \mu\mathrm{m}) \cdot 4 \big) - s\big( \lambda - 13\ \mu\mathrm{m} \big)$$ (2)
with $\lambda$ the wavelength, and we define the synthetic narrowband spectral responses $f_{nb,i}(\lambda)$, for i = 1, …, 6, as
$$\begin{aligned}
f_{nb,1}(\lambda) &= f_{mb}(\lambda)\,\big[ s\big((\lambda - 8\ \mu\mathrm{m}) \cdot 10\big) - s\big((\lambda - 9\ \mu\mathrm{m}) \cdot 10\big) \big] \\
f_{nb,2}(\lambda) &= f_{mb}(\lambda)\,\big[ s\big((\lambda - 9\ \mu\mathrm{m}) \cdot 10\big) - s\big((\lambda - 10\ \mu\mathrm{m}) \cdot 10\big) \big] \\
f_{nb,3}(\lambda) &= f_{mb}(\lambda)\,\big[ s\big((\lambda - 10\ \mu\mathrm{m}) \cdot 10\big) - s\big((\lambda - 11\ \mu\mathrm{m}) \cdot 10\big) \big] \\
f_{nb,4}(\lambda) &= f_{mb}(\lambda)\,\big[ s\big((\lambda - 11\ \mu\mathrm{m}) \cdot 10\big) - s\big((\lambda - 12\ \mu\mathrm{m}) \cdot 10\big) \big] \\
f_{nb,5}(\lambda) &= f_{mb}(\lambda)\,\big[ s\big((\lambda - 12\ \mu\mathrm{m}) \cdot 10\big) - s\big((\lambda - 13\ \mu\mathrm{m}) \cdot 10\big) \big] \\
f_{nb,6}(\lambda) &= f_{mb}(\lambda)\,\big[ s\big((\lambda - 13\ \mu\mathrm{m}) \cdot 10\big) - s\big((\lambda - 14\ \mu\mathrm{m}) \cdot 10\big) \big]
\end{aligned}$$ (3)
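The synthetic spectral responses of Equations (1)–(3) can be written compactly as in the short sketch below; the difference-of-sigmoids form follows the definitions above, and the wavelength grid is an arbitrary choice for illustration.

```python
import numpy as np

# Synthetic spectral responses from Equations (1)-(3): a sigmoid-edged
# microbolometer envelope times a difference-of-sigmoids band shape.
def s(x):
    return 1.0 / (1.0 + np.exp(-x))

def f_mb(lam_um):
    return s((lam_um - 7.5) * 4.0) - s(lam_um - 13.0)

def f_nb(lam_um, i):
    # i = 1..6 selects the 1-um band from (7 + i) to (8 + i) um.
    lo = 7.0 + i
    return f_mb(lam_um) * (s((lam_um - lo) * 10.0) - s((lam_um - lo - 1.0) * 10.0))

lam = np.linspace(6.0, 16.0, 2001)                       # wavelength grid [um]
responses = np.array([f_nb(lam, i) for i in range(1, 7)])
print(responses.shape)                                   # (6, 2001)
```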
For a given general spectral irradiance distribution $I(\lambda)$, the narrowband irradiance $I_{nb,i}$ is given by
$$I_{nb,i} = \int I(\lambda)\, f_{nb,i}(\lambda)\, d\lambda$$ (4)
and the broadband irradiance, also known as the OLR, is given by
$$\mathrm{OLR} = \int I(\lambda)\, d\lambda$$ (5)
For the specific case of black body radiation, the irradiance distribution is given by the Planck curve and depends only on the temperature T of the black body, $I(\lambda) = I_{BB}(T, \lambda)$. We define the narrowband brightness temperature $T_{nb,i}$ from a power-law fit to this black body radiation:
$$\int I_{BB}(T, \lambda)\, f_{nb,i}(\lambda)\, d\lambda \approx (a_i\, T_{nb,i})^{b_i}$$ (6)
We define the broadband brightness temperature $T_{bb}$ from the Stefan–Boltzmann law:
$$\mathrm{OLR} = \sigma\, T_{bb}^{4}$$ (7)
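As an illustration of how the per-band coefficients $a_i$ and $b_i$ of Equation (6) can be obtained, the sketch below integrates the Planck curve over one synthetic band response for a range of temperatures and fits a straight line in log-log space. It reuses the f_nb() function from the sketch above; the resulting coefficients are illustrative and are not the values used in this paper.

```python
import numpy as np

# Fit the band-integrated Planck radiation as (a_i * T)**b_i for one band,
# via linear least squares in log-log space. Assumes f_nb() from the sketch
# above is available; units are not tracked carefully (illustrative only).
h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(lam_um, T):
    lam = lam_um * 1e-6
    return 2.0 * h * c**2 / lam**5 / np.expm1(h * c / (lam * k * T))

lam = np.linspace(6.0, 16.0, 2001)        # wavelength grid [um]
dlam = lam[1] - lam[0]
T_grid = np.linspace(180.0, 320.0, 50)    # temperature range covering Earth scenes

band = 1
I_band = np.array([np.sum(planck(lam, T) * f_nb(lam, band)) * dlam for T in T_grid])

# log(I) = b*log(a) + b*log(T): the slope gives b, the intercept gives a.
b_i, intercept = np.polyfit(np.log(T_grid), np.log(I_band), 1)
a_i = np.exp(intercept / b_i)
print(f"band {band}: a ≈ {a_i:.3e}, b ≈ {b_i:.2f}")
```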
For the general narrowband irradiance $I_{nb,i}$ given by Equation (4), the narrowband brightness temperature $T_{nb,i}$ is obtained by inverting the power-law fit from Equation (6):
$$T_{nb,i} = \frac{(I_{nb,i})^{1/b_i}}{a_i}$$ (8)
and the broadband brightness temperature $T_{bb}$ is obtained by inverting the Stefan–Boltzmann law:
$$T_{bb} = \left( \frac{\mathrm{OLR}}{\sigma} \right)^{1/4}$$ (9)
Similar to [25], we use the libRadtran radiative transfer simulation tool to simulate the narrowband irradiances and the OLR for 15 reference scenes, listed in column 1 of Table 4.
For the ensemble of these 15 reference scenes, we regress $T_{bb}$ as a linear function of the $T_{nb,i}$:
$$T_{bb} \approx c_0 + \sum_{i=1}^{6} c_i\, T_{nb,i}$$ (10)
Next, we evaluate the OLR from $T_{bb}$ using the Stefan–Boltzmann law, Equation (7). The resulting OLR regression error is given in column 3 of Table 4. It lies within the range ±0.61%.
The general recipe for estimating the OLR from measured narrowband irradiances $I_{nb,i}$, for i = 1, …, 6, is the following (a short code sketch is given after the list):
1. Convert $I_{nb,i}$ to $T_{nb,i}$ using Equation (8).
2. Estimate $T_{bb}$ from the $T_{nb,i}$ using Equation (10).
3. Convert $T_{bb}$ to the OLR using Equation (7).
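A minimal sketch of this three-step recipe is given below. The coefficients a_i, b_i, and c_i and the input irradiances are placeholders chosen only to produce plausible magnitudes; the actual coefficients come from the fits to the libRadtran reference scenes described above.

```python
import numpy as np

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant [W m-2 K-4]

def estimate_olr(I_nb, a, b, c):
    """Estimate the OLR from six narrowband irradiances (Equations (8), (10), (7))."""
    T_nb = I_nb ** (1.0 / b) / a            # step 1: Equation (8)
    T_bb = c[0] + np.dot(c[1:], T_nb)       # step 2: Equation (10)
    return SIGMA * T_bb ** 4                # step 3: Equation (7)

# Placeholder coefficients and measurements, for illustration only.
a = np.full(6, 8.7e-3)
b = np.full(6, 4.0)
c = np.array([0.0, 0.2, 0.2, 0.2, 0.2, 0.1, 0.1])
I_nb = np.array([30.0, 32.0, 30.0, 25.0, 18.0, 10.0])   # hypothetical band irradiances [W/m2]
print(f"OLR ≈ {estimate_olr(I_nb, a, b, c):.1f} W/m2")
```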

5. Discussion

The monitoring of the Earth Energy Imbalance is one of the most critical aspects of understanding our planet’s climate change [4,5] and is the main mission objective of the Earth Climate Observatory (ECO) space mission concept. The ECO payload concept consists of four WFOV radiometers as main instruments, targeting an accurate differential measurement of the EEI as the difference of the Incoming Solar Radiation (ISR) and the TOR, at a low spatial resolution of approximately 6000 km for a single measurement of the TOR. As auxiliary ECO instruments, cameras are proposed, allowing us to spectrally separate the TOR into RSR and OLR and to resolve the RSR and the OLR at a spatial resolution of at least 5 km at nadir, allowing us to distinguish clear-sky from cloudy scenes. In [24], a 1500 × 1500 RGB CMOS camera was proposed, providing a nadir resolution of approximately 1 km for a single pixel and allowing us to estimate the RSR with a spectral regression error of 3% at the 2 × 2 pixel level. In [25], a 750 × 750 microbolometer camera was proposed, providing a nadir resolution of approximately 2 km and allowing us to estimate the OLR with a spectral regression error of 5% at the pixel level. In this paper, we have introduced improvements in the ECO camera concept, building further on [24,25]’s baseline designs.
For the SW camera, a single RGB camera was used in [24], with a pixel size of 3.2 μm and an image diameter of 1536 pixels, corresponding to 1536 × 3.2 μm = 4.9 mm. The accuracy by which the high-resolution RSR radiance can be estimated from this camera is limited to 3% by the information contained in the three spectral channels, i.e., the red, green, and blue channels with typical spectral responses shown in Figure 5.
Here, we have proposed a dramatic improvement in the number of spectral channels by adopting the IMEC multispectral cameras described in [29] as our SW Cameras 2 and 3. Cameras 2 and 3 provide a total of 41 spectral channels, in the form of a 4 × 4 VIS and a 5 × 5 VIS-NIR Multispectral Filter Array (MSFA), respectively, as illustrated in Figure 6. The IMEC multispectral detectors are based on the CMOSIS CMV2000 detector, with a pixel size of 5.5 μm and an image diameter of 1000 pixels, corresponding to 1000 × 5.5 μm = 5.5 mm. For the SW camera lens, we can use the optical design of [24], with a scaling of 5.5/4.9 = 1.1.
The use of MSFA-based cameras requires a demosaicing algorithm, for which deep learning (DL)-based methods are currently an active area of research [37,38]. The quality of demosaicing will improve if we can provide additional high-resolution input information. For this purpose, we propose to include SW Camera 1, a high-resolution RGB camera based on the CMOSIS CMV12000 detector, with a pixel size of 5.5 μm and image diameter of 3000 pixels, corresponding to 3000 × 5.5 μm = 16.5 mm. For this camera, we need a new optical design, which is presented in Section 3 and Figure 7.
For a thermal or longwave (LW) camera, in [25], a single ‘broadband’ microbolometer camera with spectral response between 8 and 14 μm—see the purple curve in Figure 17—is proposed. The corresponding OLR spectral regression error is 5%. Here, we propose to use six different copies of this basic camera—using the same microbolometer detector array and WFOV LW lens, see Figure 16—with six different filters in front of it. Inspired by the TIRI instrument [33] to be launched as part of the Hera mission [31], we choose six adjacent filters with a width of 1 μm, with spectral responses illustrated in Figure 17.
Extending the methodology of [25], with the newly proposed six-channel LW camera concept, in Section 4.3, we evaluate the OLR radiance spectral regression error to be 0.6%. Thus, comparing our newly proposed LW camera design to the one of [25], by going from 1 to 6 channels, we reduce the OLR spectral regression error from 5% to 0.6%.
A detailed analysis of the optical performance of the newly designed high-resolution SW camera is given in Section 3.2, an evaluation of the SNR of all SW cameras is given in Section 3.3, and the NEDT of all LW cameras is evaluated in Section 4.2. The optical performance is well above what is required for the ECO mission’s scientific objectives.

6. Conclusions

We propose in this work an update to the work previously performed for the ECO space mission to measure what can be considered the most essential of all climate variables: the Earth Energy Imbalance. The proposed payload consists of a combination of highly accurate but low-resolution radiometers measuring the incoming solar radiation and the total outgoing terrestrial radiation in a differential way, enhanced with lightweight uncooled camera systems aiming to separate the total outgoing terrestrial radiation spectrally into reflected solar (shortwave) and emitted thermal (longwave) radiation, and to increase the spatial resolution to 5 km at nadir or better.
The primary mission objective is to measure the global annual mean EEI with an accuracy of 1 W/m2 and a stability of 0.2 W/m2 per decade; in principle, this objective can be reached with the stand-alone radiometers. The addition of the auxiliary cameras will allow for a secondary mission objective of deriving high-resolution maps of the RSR and OLR. These secondary mission objectives will require the development of narrowband-to-broadband regressions and angular dependency modeling. Although not critical for the primary mission objective, a calibration of the cameras is foreseen to serve the secondary mission objectives. For the calibration of the visible cameras, we foresee deep space views and vicarious calibration. For the calibration of the thermal cameras, we foresee deep space calibrations and a shutter acting as a flat-plate black body.
Compared to our earlier proposal for the camera system, which consisted of a single SW camera and a single LW camera, we now propose a suite of three SW cameras and six LW cameras, with strongly improved spectral information content. The 2 × 2 pixel nadir spatial resolution of the highest-resolution SW camera has improved from 2.2 km in the earlier design to 1.1 km in the current design. For the LW cameras, the number of spectral channels has increased from 1 to 6, and the OLR spectral regression error has gone down from 5% to 0.6%. For the SW cameras, the number of spectral channels has increased from 3 to 44.

Author Contributions

Methodology, S.D.; Formal analysis, A.A.A.N. and Y.Z.; Writing—original draft, A.A.A.N.; Writing—review & editing, S.D. and L.S.; Supervision, S.D. All authors have read and agreed to the published version of the manuscript.

Funding

The work performed by Y.Z. was supported by the China Scholarship Council, funding a one-year stay of Y.Z. at the Royal Observatory of Belgium.

Data Availability Statement

The data generated in this study are available upon request to the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
EEI: Earth Energy Imbalance
FWHM: Full Width at Half Maximum
OLR: Outgoing Longwave Radiation
RSR: Reflected Solar Radiation
WFOV: wide field of view
VIS: Visible
VIS-NIR: Visible and Near-infrared
CMOS: complementary metal oxide semiconductor
FOV: field of view
RMS: root mean square
PSF: Point Spread Function
SNR: Signal-to-Noise Ratio
LSB: least significant bit
NEDT: Noise-Equivalent Differential Temperature

References

  1. Dewitte, S.; Clerbaux, N. Measurement of the Earth radiation budget at the top of the atmosphere—A review. Remote Sens. 2017, 9, 1143. [Google Scholar] [CrossRef]
  2. Trenberth, K.; Fasullo, J.; Von Schuckmann, K.; Cheng, L. Insights into Earth’s energy imbalance from multiple sources. J. Clim. 2016, 29, 7495–7505. [Google Scholar] [CrossRef]
  3. Von Schuckmann, K.; Cheng, L.; Palmer, M.; Hansen, J.; Tassone, C.; Aich, V.; Adusumilli, S.; Beltrami, H.; Boyer, T.; Cuesta-Valero, F.; et al. Heat stored in the Earth system: Where does the energy go? Earth Syst. Sci. Data 2020, 12, 2013–2041. [Google Scholar] [CrossRef]
  4. Hansen, J.; Nazarenko, L.; Ruedy, R.; Sato, M.; Willis, J.; Del Genio, A.; Koch, D.; Lacis, A.; Lo, K.; Menon, S.; et al. Earth’s energy imbalance: Confirmation and implications. Science 2005, 308, 1431–1435. [Google Scholar] [CrossRef] [PubMed]
  5. Von Schuckmann, K.; Palmer, M.; Trenberth, K.; Cazenave, A.; Chambers, D.; Champollion, N.; Hansen, J.; Josey, S.; Loeb, N.; Mathieu, P.; et al. An imperative to monitor Earth’s energy imbalance. Nat. Clim. Chang. 2016, 6, 138–144. [Google Scholar] [CrossRef]
  6. Schifano, L.; Smeesters, L.; Geernaert, T.; Berghmans, F.; Dewitte, S. Design and analysis of a next-generation wide field-of-view earth radiation budget radiometer. Remote Sens. 2020, 12, 425. [Google Scholar] [CrossRef]
  7. Rutan, D.; Smith, G.; Wong, T. Diurnal Variations of Albedo Retrieved from Earth Radiation Budget Experiment Measurements. J. Appl. Meteorol. Climatol. 2014, 53, 2747–2760. [Google Scholar] [CrossRef]
  8. Smith, G.; Rutan, D. The Diurnal Cycle of Outgoing Longwave Radiation from Earth Radiation Budget Experiment Measurements. J. Atmos. Sci. 2003, 60, 1529–1542. [Google Scholar] [CrossRef]
  9. Gristey, J.J.; Chiu, J.C.; Gurney, R.J.; Morcrette, C.J.; Hill, P.G.; Russell, J.E.; Brindley, H.E. Insights into the diurnal cycle of global Earth outgoing radiation using a numerical weather prediction model. Atmos. Chem. Phys. 2018, 18, 5129–5145. [Google Scholar] [CrossRef]
  10. Doelling, D.; Keyes, D.; Nordeen, M.; Morstad, D.; Nguyen, C.; Wielicki, B.; Young, D.; Sun, M. Geostationary enhanced temporal interpolation for CERES flux products. J. Atmos. Ocean. Technol. 2013, 30, 1072–1090. [Google Scholar] [CrossRef]
  11. Loeb, N.; Doelling, D.; Wang, H.; Su, W.; Nguyen, C.; Corbett, J.; Liang, L.; Mitrescu, C.; Rose, F.; Kato, S. Clouds and the earth’s radiant energy system (CERES) energy balanced and filled (EBAF) top-of-atmosphere (TOA) edition-4.0 data product. J. Clim. 2018, 31, 895–918. [Google Scholar] [CrossRef]
  12. Kato, S.; Loeb, N.; Rose, F.; Thorsen, T.; Rutan, D.; Ham, S.H.; Doelling, D. Earth Radiation Budget Climate Record Composed of Multiple Satellite Observations; GFZ German Research Centre for Geosciences: Berlin, Germany, 2023; p. IUGG23-4506. [Google Scholar]
  13. Harries, J.E.; Russell, J.E.; Hanafin, J.A.; Brindley, H.; Futyan, J.; Rufus, J.; Kellock, S.; Matthews, G.; Wrigley, R.; Last, A.; et al. The Geostationary Earth Radiation Budget Project. Bull. Am. Meteorol. Soc. 2005, 86, 945–960. [Google Scholar] [CrossRef]
  14. Hocking, T.; Mauritsen, T.; Megner, L. Sampling strategies for Earth Energy Imbalance measurements using a satellite radiometer. In Proceedings of the Earth Energy Imbalance Assessment Workshop, Frascati, Italy, 15–17 May 2023. [Google Scholar]
  15. Li, Z.; Leighton, H.G. Narrowband to Broadband Conversion with Spatially Autocorrelated Reflectance Measurements. J. Appl. Meteorol. Climatol. 1992, 31, 421–432. [Google Scholar] [CrossRef]
  16. Lee, H.T.; Gruber, A.; Ellingson, R.; Laszlo, I. Development of the HIRS Outgoing Longwave Radiation Climate Dataset. J. Atmos. Ocean. Technol. 2007, 24, 2029–2047. [Google Scholar] [CrossRef]
  17. Wang, D.; Liang, S. Estimating high-resolution top of atmosphere albedo from Moderate Resolution Imaging Spectroradiometer data. Remote Sens. Environ. 2016, 178, 93–103. [Google Scholar] [CrossRef]
  18. Urbain, M.; Clerbaux, N.; Ipe, A.; Tornow, F.; Hollmann, R.; Baudrez, E.; Velazquez Blazquez, A.; Moreels, J. The CM SAF TOA Radiation Data Record Using MVIRI and SEVIRI. Remote Sens. 2017, 9, 466. [Google Scholar] [CrossRef]
  19. Wielicki, B.A.; Barkstrom, B.; Harrison, E.; Lee, R.B., III; Smith, L.G.; Cooper, J. Clouds and the Earth’s Radiant Energy System (CERES): An Earth Observing System Experiment. Bull. Am. Meteorol. Soc. 1996, 77, 853–868. [Google Scholar] [CrossRef]
  20. Dewitte, S.; Gonzalez, L.; Clerbaux, N.; Ipe, A.; Bertrand, C.; De Paepe, B. The Geostationary Earth Radiation Budget Edition 1 data processing algorithms. Adv. Space Res. 2008, 41, 1906–1913. [Google Scholar] [CrossRef]
  21. Loeb, N.; Manalo-Smith, N.; Kato, S.; Miller, W.F.; Gupta, S.K.; Minnis, P.; Wielicki, B. Angular Distribution Models for Top-of-Atmosphere Radiative Flux Estimation from the Clouds and the Earth’s Radiant Energy System Instrument on the Tropical Rainfall Measuring Mission Satellite. Part I: Methodology. J. Appl. Meteorol. Climatol. 2003, 42, 240–265. [Google Scholar] [CrossRef]
  22. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  23. Dewitte, S.; Cornelis, J.P.; Müller, R.; Munteanu, A. Artificial Intelligence Revolutionises Weather Forecast, Climate Monitoring and Decadal Prediction. Remote Sens. 2021, 13, 3209. [Google Scholar] [CrossRef]
  24. Schifano, L.; Smeesters, L.; Berghmans, F.; Dewitte, S. Optical system design of a wide field-of-view camera for the characterization of earth’s reflected solar radiation. Remote Sens. 2020, 12, 2556. [Google Scholar] [CrossRef]
  25. Schifano, L.; Smeesters, L.; Berghmans, F.; Dewitte, S. Wide-field-of-view longwave camera for the characterization of the earth’s outgoing longwave radiation. Sensors 2021, 21, 4444. [Google Scholar] [CrossRef]
  26. Wu, A.; Xiong, X.; Doelling, D.; Morstad, D.; Angal, A.; Bhatt, R. Characterization of Terra and Aqua MODIS VIS, NIR, and SWIR spectral bands’ calibration stability. IEEE Trans. Geosci. Remote Sens. 2012, 51, 4330–4338. [Google Scholar] [CrossRef]
  27. Decoster, I.; Clerbaux, N.; Baudrez, E.; Dewitte, S.; Ipe, A.; Nevens, S.; Blazquez, A.; Cornelis, J. Spectral aging model applied to meteosat first generation visible band. Remote Sens. 2014, 6, 2534–2571. [Google Scholar] [CrossRef]
  28. Sterckx, S.; Adriaensen, S.; Dierckx, W.; Bouvet, M. In-orbit radiometric calibration and stability monitoring of the PROBA-V instrument. Remote Sens. 2016, 8, 546. [Google Scholar] [CrossRef]
  29. Geelen, B.; Blanch, C.; Gonzalez, P.; Tack, N.; Lambrechts, A. A tiny VIS-NIR snapshot multispectral camera. In Proceedings of the Advanced Fabrication Technologies for Micro/Nano Optics and Photonics VIII, San Francisco, CA, USA, 8–11 February 2015; Volume 9374, pp. 194–201. [Google Scholar]
  30. Weidmann, D.; Antonini, K.; Pino, D.; Brodersen, B.; Patel, G.; Hegglin, M.; Sioris, C.; Bell, W.; Miyazaki, K.; Alminde, L.; et al. Cubesats for monitoring atmospheric processes (CubeMAP): A constellation mission to study the middle atmosphere. In Proceedings of the Sensors, Systems, and Next-Generation Satellites XXIV, Online, 21–25 September 2020; Volume 11530, pp. 141–159. [Google Scholar]
  31. Michel, P.; Küppers, M.; Bagatin, A.; Carry, B.; Charnoz, S.; De Leon, J.; Fitzsimmons, A.; Gordo, P.; Green, S.; Hérique, A.; et al. The ESA Hera mission: Detailed characterization of the DART impact outcome and of the binary asteroid (65803) Didymos. Planet. Sci. J. 2022, 3, 160. [Google Scholar] [CrossRef]
  32. Adibekyan, A.; Kononogova, E.; Monte, C.; Hollandt, J. High-accuracy emissivity data on the coatings Nextel 811-21, Herberts 1534, Aeroglaze Z306 and Acktar Fractal Black. Int. J. Thermophys. 2017, 38, 1–14. [Google Scholar] [CrossRef]
  33. Okada, T.; Tanaka, S.; Sakatani, N.; Shimaki, Y.; Arai, T.; Senshu, H.; Demura, H.; Sekiguchi, T.; Kouyama, T.; Kanamaru, M.; et al. Calibration of the Thermal Infrared Imager TIRI onboard Hera. In Proceedings of the European Planetary Science Congress, Granada, Spain, 18–23 September 2022; p. EPSC2022-1191. [Google Scholar]
  34. Schmetz, J.; Pili, P.; Tjemkes, S.; Just, D.; Kerkmann, J.; Rota, S.; Ratier, A. An introduction to Meteosat second generation (MSG). Bull. Am. Meteorol. Soc. 2002, 83, 977–992. [Google Scholar] [CrossRef]
  35. Ouaknine, J.; Viard, T.; Napierala, B.; Foerster, U.; Fray, S.; Hallibert, P.; Durand, Y.; Imperiali, S.; Pelouas, P.; Rodolfo, J.; et al. The FCI on board MTG: Optical design and performances. In Proceedings of the International Conference on Space Optics, Okinawa, Japan, 14–16 November 2017; Volume 10563, pp. 617–625. [Google Scholar]
  36. Tissot, J.; Tinnes, S.; Durand, A.; Minassian, C.; Robert, P.; Vilain, M. High-performance uncooled amorphous silicon VGA and XGA IRFPA with 17 µm pixel-pitch. In Proceedings of the Electro-optical and infrared systems: Technology and applications VII, Toulouse, France, 21–23 September 2010; Volume 7834, pp. 147–154. [Google Scholar]
  37. Arad, B.; Timofte, R.; Yahel, R.; Morag, N.; Bernat, A.; Wu, Y.; Wu, X.; Fan, Z.; Xia, C.; Zhang, F.; et al. NTIRE 2022 spectral demosaicing challenge and data set. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 882–896. [Google Scholar]
  38. Zeng, H.; Feng, K.; Huang, S.; Cao, J.; Chen, Y.; Zhang, H.; Luong, H.; Philips, W. MSFA-Frequency-Aware Transformer for Hyperspectral Images Demosaicing. arXiv 2023, arXiv:2303.13404. [Google Scholar] [CrossRef]
Figure 1. Multiannual mean Total Outgoing Radiation at 1° resolution. Unit: W/m2. Reproduced from [1].
Figure 2. Multiannual mean Reflected Solar Radiation at 1° resolution. Unit: W/m2. Reproduced from [1].
Figure 3. Multiannual mean Outgoing Longwave Radiation at 1° resolution. Unit: W/m2. Reproduced from [1].
Figure 4. Wide-field-of-view reference lens design for the SW camera. Reproduced from [24]. The total axial length equals 85.5 mm. The lens train consists of 3 singlet (s) lenses and an achromatic doublet (d) arranged in an s-s-s-d format. The detector has a size of 4.92 × 4.92 mm. The FOV is 140°. The 12 different colors pertain to 12 different angles between 0° and 70° with a step of 6.36°.
Figure 5. RGB CMOS spectral responses used in [24].
Figure 6. Realization of the ECO SW camera suite, using 3 separate cameras.
Figure 7. Optical design of the SW camera. The total axial length equals 50.6 mm. The lens train consists of 4 singlet (s) lenses and an achromatic doublet (d) arranged in an s-s-s-d-s format. The detector has a size of 16.5 × 16.5 mm. The FOV is 140°. The 12 different colors pertain to 12 different angles between 0° and 70° with a step of 6.36°.
Figure 8. The RMS error shown in μm versus incidence angle of the light on the first surface of the lens. The colors depict various wavelengths of light (in μm).
Figure 9. The five different Seidel aberrations and chromatic aberrations. The largest contributor is the barrel distortion.
Figure 10. The average optical PSF of all simulated wavelengths for the 0° incidence angle. Image size 53.236 by 53.236 μm.
Figure 11. The active pixel layer of type (left) red/blue pixels and (right) green-green pixels, for a fixed detector area of 53.236 by 53.236 μm, where individual pixels cover 80% of a nominal 5.5 μm square area.
Figure 12. The convolution of the active pixel layer of type (left) red/blue and (right) green-green with the PSF of 0° incident light, where individual pixels are overlaid over the convolution.
Figure 13. (Left): The PSF of the 400 nm wavelength for a 70° incidence angle. Image size 53.236 by 53.236 μm. (Right): The convolution of the active pixel layer of blue type with the PSF of 70° incident light of 400 nm wavelength, where individual pixels are overlaid over the convolution.
Figure 14. (Left): The PSF of the 500 nm wavelength for a 38° incidence angle. Image size 66.545 by 66.545 μm. (Right): The convolution of the active pixel layer of green type with the PSF of 38° incident light of 500 nm wavelength, where individual pixels are overlaid over the convolution.
Figure 15. Signal-to-Noise Ratio versus wavelength of multispectral cameras for 166 integrations.
Figure 16. Wide-field-of-view reference lens design for the thermal camera. The total axial length equals 86.12 mm. The system consists of 3 singlet lenses. The aperture stop is situated between the first two lenses. The circular image on the detector has a radius of 6.5 mm. The different colors correspond to the different fields between 0° and 70°.
Figure 17. Longwave spectral responses.
Table 1. Lens data: surface type, material, thickness, and diameters. Three aspherical surfaces. s-s-s-d-s format.
| Lens Order (Singlet/Doublet) | Front Surface Type | Rear Surface Type | Material | Thickness (mm) | Diameter (mm) |
|---|---|---|---|---|---|
| First lens (s: singlet) | Aspherical | Spherical | LAK14 | 3.7 | 32 |
| Second lens (s) | Spherical | Spherical | LAK14 | 5 | 23 |
| Third lens (s) | Spherical | Spherical | SF6 | 5 | 12 |
| Fourth lens (d: doublet) | Spherical | Spherical | N-FK51A | 2.7 | 7.6 |
| Fifth lens (d) | Spherical | Aspherical | N-SF6 | 2 | 7.6 |
| Sixth lens (s) | Spherical | Aspherical | LAK14 | 2.2 | 11.4 |
Table 2. ECO longwave spectral bands. The limits are defined at the FWHM (Full Width at Half Maximum) of the spectral bands.
| Band | Limits |
|---|---|
| LW-1 | 8–9 μm |
| LW-2 | 9–10 μm |
| LW-3 | 10–11 μm |
| LW-4 | 11–12 μm |
| LW-5 | 12–13 μm |
| LW-6 | 13–14 μm |
Table 3. Longwave filter fraction and NEDT.
| | Band 1 | Band 2 | Band 3 | Band 4 | Band 5 | Band 6 |
|---|---|---|---|---|---|---|
| Fraction of black body radiation | 0.1087 | 0.1155 | 0.1083 | 0.0908 | 0.0643 | 0.0362 |
| NEDT per band (mK) | 51.55 | 50.03 | 51.66 | 56.40 | 67.02 | 89.38 |
Table 4. Longwave reference scenes, OLR, and OLR regression error.
| Scene | OLR (W/m2) | OLR Error (%) |
|---|---|---|
| US standard—clear sky | 257.59 | 0.21 |
| Tropical—clear sky | 284.46 | −0.44 |
| Midlatitude summer—clear sky | 277.87 | −0.25 |
| Midlatitude winter—clear sky | 227.95 | −0.46 |
| Subarctic summer—clear sky | 260.81 | −0.11 |
| Subarctic winter—clear sky | 197.04 | −0.17 |
| US standard—water cloud | 214.31 | 0.61 |
| US standard—thin ice cloud | 184.32 | 0.04 |
| US standard—thick ice cloud | 124.75 | −0.61 |
| Midlatitude winter—water cloud | 200.64 | −0.52 |
| Midlatitude winter—thin ice cloud | 171.44 | −0.49 |
| Midlatitude winter—thick ice cloud | 125.25 | 0.23 |
| Subarctic summer—water cloud | 227.57 | −0.42 |
| Subarctic summer—thin ice cloud | 196.57 | −0.07 |
| Subarctic summer—thick ice cloud | 142.98 | −0.08 |
