1. Introduction
Remote-sensing observations of the North and South Poles are critical for understanding ecosystem dynamics and the evolution of the polar environment. Polar regions are key components of Earth’s climate system, and changes in their ice caps, glaciers, and sea ice directly affect global sea levels and climate patterns [1,2]. Owing to the extreme harshness of polar environments and the high cost and risk of fieldwork, remote-sensing technology has become an irreplaceable means of obtaining large-scale and continuous observation data [2,3,4,5]. This approach provides a scientific basis for predicting future climate trends and formulating coping strategies through the real-time monitoring of key indicators, such as ice-sheet melting, changes in the extent of sea ice, and the rise in polar temperatures.
Polar remote-sensing techniques are evolving to obtain a better understanding of polar regions. Macdonald et al. presented a new design concept for space-based polar remote sensing, emphasising the need for improved observation techniques [6]. Mineart et al. discussed the importance of the Joint Polar Satellite System for water-leaving radiance measurements and demonstrated the value of remotely sensed data in ocean and coastal observations [7]. The European Space Agency has proposed the G-TERN mission, the core of which is the realisation of high temporal and spatial resolution observations of polar sea ice, oceans, and atmospheres by a single near-polar-orbiting satellite using Global Navigation Satellite System Reflectometry (GNSS-R) technology, scheduled to be conducted between 2025 and 2030 [8]. Han et al. focused on the image motion of wide-field remote-sensing cameras in polar regions, with theoretical modelling based on area-array camera image instability [9]. Rocchini et al. presented a methodology for estimating ecosystem diversity using high-resolution remotely sensed data, emphasising the identification of spectral species and the ability to map α and β diversity over wide geographical areas [10]. In the polar context, Li emphasised the importance of remotely sensed big data in marine and polar studies and demonstrated the utility of remote-sensing techniques in these regions [11]. Advances in modelling techniques are essential for deepening our understanding of polar environments and their impact on global systems [12].
Image quality is mainly evaluated by analysing the modulation transfer function (MTF) of an image. The MTF is a key parameter for characterising the quality of remote-sensing satellite images and represents the ability of optical payloads to produce “sharp” images [13,14]. Ardanuy et al. emphasised the trade-off between image quality and sensitivity in remote-sensing systems and the importance of the MTF and signal-to-noise ratio (SNR) in system performance optimisation [15]. Mary et al. reviewed various methods for evaluating spatial-resolution parameters in flight, including the MTF [16]. Han-Zhou et al. established a mathematical model of the dynamic MTF for a remote-sensing camera, considering the influence of the line transmission cycle error [17]. When analysing the in-orbit imaging of space cameras, Wu and Li designed a matching strategy between integration stages and yaw angle under the constraint of a 5% decrease in the MTF [18,19]. Dubey et al. proposed a scheme to improve the contrast of remote-sensing images, emphasising the importance of the MTF [20]. Accurate in-flight measurement of the MTF of ALSAT-2A images has also been discussed, describing the sensor’s ability to resolve the spatial details of images formed from incident optical information; the ALSAT-2A system was shown to be well stabilised, with a better-than-expected evolution of the MTF [21].
Motion compensation in remote-sensing camera images is critical for ensuring high-quality image capture and processing. Walter and Schneker explored the application of micromechanical devices for motion compensation in satellite-borne charge-coupled device (CCD) imaging systems; this method aims to minimise image blurring caused by the relative motion between the scene and the imaging system [22]. Sorel et al. discussed recent advances in spatially variant deblurring and image stabilisation techniques, highlighting the importance of image-processing methods for improving image quality under low-ambient-light conditions [23]. Wang et al. proposed a fine image-motion-compensation method for panoramic time-delayed integration (TDI)-CCD cameras, specifically for remote-sensing applications; this method focuses on improving image quality by compensating for motion blur [24]. Heo et al. proposed a technique to reduce motion blur by adjusting the integration time of a TDI-CMOS scanning camera, emphasising the importance of optimising camera settings to minimise motion blur [25]. Zhao et al. proposed a deep-learning-based super-resolution method for remote-sensing satellite videos that applies explicit motion compensation through optical-flow estimation and warping operations [26]. Gao et al. introduced a motion-compensation method based on multi-actuator control and mode switching to solve the scanning-imaging motion-compensation problem [27].
In summary, the literature on motion compensation for remote-sensing images covers a wide range of innovative methods and techniques that address the various challenges of image-motion compensation in remote-sensing applications, including optical satellite imaging, synthetic aperture radar (SAR) imaging, and video super-resolution. However, research on motion compensation for polar remote-sensing imaging remains insufficient. This paper analyses the anisotropic image-motion velocity field by constructing a velocity-field model and, on this basis, establishes a quantitative relationship between TDI imaging quality and velocity-field anisotropy, providing a firmer theoretical basis for polar remote sensing.
Unlike area-array cameras, TDI cameras have a small field of view in the push-scan direction and a large field of view in the line-array direction (“push-scan direction” refers to the along-track direction and “line-array direction” to the across-track direction), and the TDI detector supports slice-wise adjustment of the line frequency, which is advantageous for image-motion correction. By optimising the synchronisation between the line-scan speed and the exposure time, it enhances signal accumulation while minimising saturation and noise. Additionally, the TDI detector features a simpler pixel structure with better inter-pixel response uniformity than area-array detectors, contributing to higher radiometric-calibration accuracy. To ensure high-quality polar imaging in the large-field-of-view scanning mode, the research object of this study is the scanning imaging of high-resolution space cameras. We established a velocity-field model for the polar conditions of a camera on a sun-synchronous-orbit satellite, constructed the mapping between a ground target and the motion velocity of its image point at the focal plane during dynamic scanning of the polar region, and derived a degradation model of the dynamic transfer function in the TDI mode. On this basis, the line frequency and yaw angle were adjusted to compensate for the image motion, polar imaging with the TDI scanning camera was simulated and analysed in terms of the MTF, and the correctness of the theory was verified with on-orbit images.
2. Polar Scanning Imaging Model
Polar exploration satellites generally operate in near-polar orbits, with orbital inclinations close to 90°. However, considering resource allocation and Earth remote sensing, sun-synchronous-orbit satellites are generally used to carry out remote-sensing observations of polar regions. When passing near the North and South Poles, the camera attitude and imaging parameters are planned in advance to complete polar remote-sensing imaging.
As shown in Figure 1, during polar remote-sensing imaging the image-motion velocities of different target points vary with time and are not constant. Because the ground-target velocity field is anisotropic on the image plane, a detailed design and analysis of the imaging model is required.
2.1. Polar Image-Motion Velocity-Field Modelling
To visualise the calculation and analysis of the velocity-field model, we established a velocity-vector model at the target point on the object surface. First, the geocentric coordinate system OXYZ was established, with O at the Earth’s centre, the Z-axis pointing to the North Pole, the X-axis pointing to the vernal equinox, and the Y-axis orthogonal to the other two axes in accordance with the right-hand rule. The orbital inclination is denoted i, and β is the angle of the subsatellite point from the ascending node.
The star–Earth relationship as a satellite moves from a low- to a high-latitude region is shown in Figure 2. When a satellite in a sun-synchronous orbit images a polar target at a side-swing angle θ, the camera view axis points to the target point S, and the perpendicular from S along its meridian meets the equatorial plane at point D. Angle φ is the latitude of the view-axis ground point, and γ is the geocentric angle corresponding to the view axis. A target point away from the view axis corresponds to a different field-of-view angle α, and the geocentric angle between such a target point and the view axis follows from the relations below.
The geocentric angle corresponding to a target point at the camera side-swing angle θ is determined by spherical trigonometric relations:

$$ \gamma(\alpha) = \arcsin\!\left(\frac{R+H}{R}\,\sin(\theta+\alpha)\right) - (\theta+\alpha), $$

where R is the Earth’s radius, H is the satellite’s orbital altitude, and α is the field-of-view angle corresponding to the target point.
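For concreteness, the following minimal Python sketch evaluates this spherical-Earth relation numerically; the function name and example values are illustrative rather than taken from the original study (the 500 km altitude matches Section 3).

```python
import numpy as np

R_EARTH = 6371.0   # mean Earth radius, km (assumed spherical)
H_ORBIT = 500.0    # orbital altitude, km (value used in Section 3)

def geocentric_angle_deg(side_swing_deg, fov_deg):
    """Geocentric angle between the subsatellite point and a target seen at a
    total look angle of (side-swing angle + field-of-view angle)."""
    look = np.radians(side_swing_deg + fov_deg)
    # Law of sines in the Earth-centre / satellite / target triangle:
    #   sin(look + gamma) / (R + H) = sin(look) / R
    gamma = np.arcsin((R_EARTH + H_ORBIT) / R_EARTH * np.sin(look)) - look
    return np.degrees(gamma)

# Example: 45 deg side swing; view axis vs. the edge of a 22 deg field of view
print(geocentric_angle_deg(45.0, 0.0))   # ~4.7 deg at the view axis
print(geocentric_angle_deg(45.0, 11.0))  # ~7.4 deg at the field edge
```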
The latitude of the target point then follows from the spherical triangle formed by the target point, the ascending node, and the foot point D on the equatorial plane.
The angle ε is defined as the angle between the following two vectors: (1) the perpendicular from the subsatellite point to the equatorial plane and (2) the line connecting the subsatellite point and the ascending node; it is likewise obtained from spherical trigonometry.
From these angles, the satellite velocity and the ground velocity at the target point can be obtained.
The forward (vx) and lateral (vy) image-motion velocities of the target point on the camera focal plane are obtained by projecting the relative ground velocity of the target through the imaging geometry and scaling by the ratio of the camera focal length f to the target slant range.
The focal-plane line frequency is then $f_{\mathrm{line}} = v_x / a$, where a is the detector pixel pitch, and the yaw (drift) angle is $\psi = \arctan(v_y / v_x)$. With this velocity-field model, the velocity components of a target point in the image plane can be calculated for satellites at different latitudes and orbital inclinations, so the model remains valid for other orbital configurations.
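As a sketch of how these focal-plane quantities follow from the velocity components, the snippet below converts assumed forward and lateral image-motion velocities into a line frequency and a drift angle; the pixel pitch and velocities are placeholder values, not parameters of the studied camera.

```python
import math

def line_frequency_and_drift(v_x, v_y, pixel_pitch):
    """Line frequency (Hz) and drift angle (deg) from the forward (v_x) and
    lateral (v_y) image-motion velocities on the focal plane, both in m/s."""
    f_line = v_x / pixel_pitch                 # one line per pixel of motion
    psi = math.degrees(math.atan2(v_y, v_x))   # drift (yaw) angle
    return f_line, psi

# Placeholder example: 7 um pixels, 50 mm/s forward, 0.5 mm/s lateral motion
f_line, psi = line_frequency_and_drift(v_x=0.050, v_y=0.0005, pixel_pitch=7e-6)
print(f"line frequency = {f_line:.0f} Hz, drift angle = {psi:.3f} deg")
```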
2.2. Calculation of the Transfer Function for TDI Camera Scanning Imaging
In push-scan imaging, given the push-scan speed and spatial resolution, the line frequency can be obtained for different parameter conditions. The maximum exposure time of a push-scan camera is constrained by the line period. Unlike in a TDI-CCD, the exposure time of a digital-domain TDI-CMOS imaging system can be set independently of the line period. However, in actual on-orbit imaging, owing to factors such as orbit-altitude error and delayed attitude data, a certain error exists between the line frequency actually required and the line frequency output by the camera. This causes the pixels at different integration stages to expose slightly offset ground targets, which degrades the on-orbit imaging quality.
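To make the error mechanism explicit, a relative line-frequency error maps directly into a per-stage image displacement; the sketch below uses illustrative numbers, not mission values.

```python
# Per-stage image displacement caused by a line-frequency error.
# While the charge packet advances exactly one pixel per line period, the
# image advances f_required / f_actual pixels, so the per-stage mismatch is:
f_required = 7143.0   # line frequency needed to track the image motion, Hz
f_actual = 7140.0     # line frequency actually output by the camera, Hz

shift_per_stage_px = f_required / f_actual - 1.0   # ~0.00042 px per stage
smear_after_196_px = 196 * shift_per_stage_px      # accumulated over 196 stages
print(f"{shift_per_stage_px:.5f} px/stage, {smear_after_196_px:.3f} px total")
```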
During the integration time, the spread function is obtained as the superposition of the spread functions at different moments. When there is image shift, the width of the blur spot increases, the peak energy decreases, and the image becomes blurred. In addition, if the image shift is irregular, the position of the blur spot also changes, leading to geometric distortion of the image, as shown in Figure 3. Therefore, to accurately analyse the on-orbit imaging quality of a digital-domain TDI-CMOS imaging system, a dynamic image-shift–transfer-function model must be constructed.
The energy distribution during the single-stage exposure time is integrated and normalised to obtain the x-direction line spread function affected by the single-stage image shift of the system:

$$ \mathrm{LSF}_1(X) = \frac{1}{T}\int_0^{T} \delta\big(X - x(t)\big)\,\mathrm{d}t, $$

where T is the integration time, δ(·) is the Dirac function describing the instantaneous image position, X is the absolute position coordinate of the pixel on the image plane, x(t) is the displacement of the image point in the push-scan direction during the exposure time, and y(t) is the absolute image shift in the direction perpendicular to the push-scan direction; the line spread function in the y-direction is obtained analogously.
The line spread function after N-stage integration is the average of the single-stage functions shifted by the residual image motion accumulated at each stage:

$$ \mathrm{LSF}_N(X) = \frac{1}{N}\sum_{i=1}^{N} \mathrm{LSF}_1\big(X - \Delta x_i\big), $$

where Δx_i is the image-shift residual accumulated up to stage i.
The optical transfer function is obtained as the Fourier transform of this line spread function:

$$ \mathrm{OTF}(f) = \int \mathrm{LSF}_N(X)\, e^{-j 2\pi f X}\,\mathrm{d}X. $$
Expanding the optical transfer function over the integration stages and taking its modulus yields the system MTF, MTF(f) = |OTF(f)|.
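The numerical sketch below illustrates one common simplification of this expansion: for a constant per-stage shift residual, the multilevel MTF factor is the modulus of the averaged per-stage phase terms (a Dirichlet kernel). This is a simplified stand-in for the full model in the paper, with assumed residual values.

```python
import numpy as np

def tdi_shift_mtf(n_stages, shift_per_stage_px, freq_cyc_per_px):
    """MTF factor caused by a constant per-stage image-shift residual in a
    TDI sensor: the modulus of the mean of the per-stage phase terms."""
    k = np.arange(n_stages)
    phases = np.exp(-2j * np.pi * freq_cyc_per_px * shift_per_stage_px * k)
    return np.abs(phases.mean())

NYQUIST = 0.5  # cycles per pixel
for n in (32, 96, 196):
    mtf = tdi_shift_mtf(n, shift_per_stage_px=5e-4, freq_cyc_per_px=NYQUIST)
    print(f"{n:3d} stages: residual-shift MTF factor at Nyquist = {mtf:.4f}")
```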
The dynamic transfer function of the camera in orbit is determined primarily by the static transfer function of the camera, the target contrast, large-scale perturbations, and other factors. Image shift during the imaging process also significantly affects the dynamic transfer function of a digital-domain TDI remote-sensing camera, particularly under multilevel integration. Because the transfer functions of the target, atmosphere, and optical system are relatively well studied, this study focuses on the effect of image shift on the dynamic transfer function in this imaging mode and on establishing the relationship between the integration level and imaging quality. Here, the effect on the transfer function is quantified as the decrease in the MTF.
3. Simulation Analysis
To achieve the required ground pixel resolution, the focal length of a high-resolution space camera is quite long; to ensure the field of view and the ground coverage width, the focal plane is usually composed of multiple detectors spliced onto the image-plane substrate. During photography, the space-camera controller adjusts the detector row-transfer period to match the calculated image-shift velocity and performs electronic image-shift compensation. The row period of each CCD can be adjusted uniformly or separately, whereas the drift angle is adjusted uniformly by the satellite attitude-control system and the image-plane drift-angle adjustment mechanism. When imaging high-latitude regions, the magnitude and direction of the ground-velocity component projected onto different CCDs differ; the velocity-field distribution at different target points is shown in Figure 4.
The stereoscopic mapping satellite adopts a sun-synchronous circular orbit with an orbital altitude of 500 km and a high-resolution camera with a field of view of 22° and a focal length f = 5 m. The camera simulation parameters are summarised in Table 1.
Figure 5 shows the variation of Earth’s rotational velocity with the angle β between the subsatellite point and the ascending node for different fields of view, at a side-swing angle of 45°.
As shown in Figure 5, as Earth’s rotational velocity changes, little difference is observed between the target ground speeds of different fields of view at low latitudes, whereas an obvious difference appears at high latitudes. As shown in Figure 6, the line frequency differs across fields of view: it changes significantly as the satellite enters the high-latitude region and reaches its maximum when the argument angle β is 90°, at which point the image-shift speed at all positions in the field of view is smallest.
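The widening spread between fields of view at high latitude follows from the elementary relation v = ωe·R·cos(φ): near the pole, cos(φ) changes rapidly with latitude, so targets separated by the cross-track swath see markedly different rotation speeds. A minimal sketch with illustrative latitudes:

```python
import numpy as np

OMEGA_E = 7.2921e-5  # Earth rotation rate, rad/s
R_EARTH = 6371e3     # mean Earth radius, m

def rotation_speed(lat_deg):
    """Linear speed of Earth's surface rotation at the given latitude, m/s."""
    return OMEGA_E * R_EARTH * np.cos(np.radians(lat_deg))

# Compare targets ~1 deg of latitude apart (the order of a wide swath)
for lat in (5.0, 80.0):
    spread = rotation_speed(lat) - rotation_speed(lat + 1.0)
    print(f"lat {lat:4.1f} deg: v = {rotation_speed(lat):6.1f} m/s, "
          f"spread over 1 deg = {spread:4.1f} m/s")
```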
Using the image-shift model, we analysed the influence of Earth’s rotation at different latitudes on the image-shift velocity field under large side-swing conditions; Figure 7 shows a schematic of the rotational velocity field. As shown in Figure 7, when imaging at a 45° side-swing angle, the rate of change of the rotation vectors in different fields of view accelerates as the latitude increases.
Figure 8 shows the variation of the drift angle with the angle β between the subsatellite point and the ascending node at a side-swing angle of 45°.
An analysis of the drift-angle variation trend reveals that the edge FOV and centre FOV of the camera exhibit negligible drift-angle differences in low-latitude regions, while a 0.03° discrepancy is observed in high-latitude areas.
Using the image-shift calculation method for side-swing photography with a large-field-of-view space camera, the effect on the MTF of uniformly adjusting the line frequency and drift angle during side-swing photography was analysed; the theoretical results are shown in Figure 8. According to the imaging transfer-function model discussed in Section 2, the image degradation of the TDI camera under the uniformly adjusted and piecewise-adjusted line-frequency modes during scanning was quantitatively analysed, and the imaging transfer function was obtained for different integration levels across the field of view.
Under wide-swath imaging conditions, where the image-quality requirements of the central field of view are relatively high, the attitude yaw angle can be set to the drift angle of the central slice. In this case, the MTF at the edge of the field of view degrades, as shown in Figure 9.
As shown in Figure 10, correctly matching the charge-transfer speed to the scene scanning speed through piecewise adjustment of the line frequency significantly improves the image quality of the TDI imaging system. After yawing to the drift angle of the central field of view, the transfer function of the edge field of view decreases owing to the drift-angle difference between the central and edge fields of view. Simultaneously, as the number of integration stages increases, the accumulated image shift grows, further reducing the transfer function. When the satellite images at high latitudes with a 45° side swing, the transfer function of the edge field of view after 196 integration stages decreases by 0.8%; without side swing, it decreases by 1.7%. This is primarily because, after the side swing, the closer the camera view axis is to the pole, the smaller the ground speed; the uncompensated image shift caused under this condition is therefore quite small, and its impact on the transfer function is correspondingly small. If the transfer-function decrease must remain below 5% to ensure on-orbit dynamic imaging quality, integration levels up to 196 meet the requirement.
During in-orbit operation, whether in imaging or standby mode, the potential impact of direct sunlight on the optical camera must be considered. The satellite studied in this paper therefore adopts a sun-synchronous orbit: its transit time is fixed and the solar azimuth can be accurately predicted, which facilitates imaging planning combined with the preset solar-avoidance-angle constraint during side-swing manoeuvres.
For the subsatellite-point imaging scene, if the camera field-of-view angle is ±30° and the drift-angle difference is 0.1°, then to ensure that the MTF decrease does not exceed 5%, the integration level should be limited to 128 or below, as shown in Figure 11; the system performance then meets the task requirements.
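As a rough cross-check of this constraint, the sketch below estimates the cross-track smear accumulated over N stages from a drift-angle residual and evaluates the resulting sinc-model MTF loss at Nyquist. This simplified model omits the optical, detector, and sampling terms of the full transfer-function chain, so its absolute numbers need not match Figure 11; the 0.1° residual is taken from the scenario above.

```python
import math

def smear_mtf_drop(n_stages, drift_residual_deg, freq_cyc_per_px=0.5):
    """Fractional MTF loss at the given spatial frequency for the cross-track
    smear accumulated over n_stages of TDI with a drift-angle residual."""
    smear_px = n_stages * math.tan(math.radians(drift_residual_deg))
    x = math.pi * freq_cyc_per_px * smear_px
    return 0.0 if x == 0.0 else 1.0 - math.sin(x) / x

for n in (64, 128, 196):
    drop = smear_mtf_drop(n, drift_residual_deg=0.1)
    print(f"{n:3d} stages: MTF drop ~ {100.0 * drop:.1f}%")
```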
4. On-Orbit Validation
In this study, a high-resolution wide-swath camera on a remote-sensing satellite operating in a sun-synchronous orbit with an orbital altitude of 500 km and an orbital inclination of 97.4° was used as an example. It adopts the push-scan imaging mode with a panchromatic resolution of 0.75 m and a multispectral resolution of 3 m. To monitor changes in polar glaciers, images were acquired during high-latitude passes.
At the time of imaging, the solar altitude angle was 23.5461°, the side-swing angle was −0.51°, the cloud cover was 0%, and the optical axis pointed to geographic coordinates (82.03°, −56.03°). This validation used the central-slice yaw angle because the camera side-swing angle was negligible. The left-edge, centre, and right-edge slice detectors of the camera were selected to compare the imaged scenes; the maximum spacing between the left and right scenes was 121.9 km. Because polar scenery is relatively simple, local zoomed-in views of a coastal region were selected to better compare image quality across fields of view, as shown in Figure 12.
From the three selected scene images, representative local areas were chosen for comparison to better observe the geomorphic features, as shown in Figure 13.
An analysis of the images of each slice showed no obvious degradation in the imaging quality of any scene. Compared with images from area-array wide-swath remote-sensing satellites, the texture of the scenes imaged by the TDI camera was clearer, indicating that the high-resolution wide-swath TDI camera of this remote-sensing satellite can capture subtle surface deformations under high-latitude observation conditions, effectively guaranteeing imaging quality and supporting the accurate interpretation of remote-sensing images for monitoring changes in polar glaciers.
5. Conclusions
In this study, the effect of image-shift velocity anisotropy on imaging quality was investigated for high-latitude polar observations with a large-field-of-view space camera. A mathematical model of the image-shift velocity field was established for the high-latitude region; the analysis shows that, even after drift-angle correction, uncorrectable drift-angle residuals remain in the edge field of view because of the image-shift velocity anisotropy. On this basis, a dynamic image-shift–transfer-function model was constructed, and the MTF was adopted as the criterion for image quality.
High-latitude images were analysed to determine the influence of image-shift anisotropy on the MTF. When imaging at high latitudes, the edge drift-angle residuals led to a 2% transfer-function degradation after 196-stage integration imaging, with no obvious degradation in image quality. To verify this result further, high-latitude images from a remote-sensing wide-swath camera were selected for comparison; both the edge and central field-of-view images showed good imaging results. The findings of this study provide a theoretical reference for current large-field-of-view space cameras addressing the edge-blur degradation problem when acquiring high-latitude target information.