Communication

A Remote Sensing Image Quality Interpretation Scale Characterization Method Based on the TTP Criterion

School of Optoelectronic Engineering, Xidian University, Xi’an 710071, China
* Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(17), 4121; https://doi.org/10.3390/rs15174121
Submission received: 13 June 2023 / Revised: 20 August 2023 / Accepted: 21 August 2023 / Published: 22 August 2023

Abstract
Accurate grading of remote sensing image interpretability is crucial for improving image classification and screening efficiency. Through extensive research, the General Image Quality Equation (GIQE), based on the National Imagery Interpretability Rating Scale (NIIRS), has been developed. However, poor reliability and low accuracy remain because human visual characteristics are not considered. This paper introduces the Target Task Performance (TTP) criterion as a key parameter to reflect both the cascaded degradation along the system imaging chain and the perceptual characteristics of human vision, which improves the reliability of the model. A New Optimized Remote Sensing Image Quality Equation (NORSIQE), which effectively predicts the interpretability of image information, is constructed. Using 200 sets of test data, the quantitative relationship between the key parameters of the NORSIQE (GSD, TTP, and SNR) and the subjective NIIRS level is obtained by a least squares regression fit, and the determination coefficient of the model reaches 0.916. The model's accuracy is then evaluated on 120 sets of validation data, showing an 87% improvement over the GIQE4. This method provides theoretical support for developing new remote sensing image quality evaluation methods and for designing payloads for remote sensing imaging systems.

1. Introduction

With the continuous updating of remote sensing technology, the number of satellites has increased and the ability of data acquisition has been greatly improved. However, in the face of massive image data, it is necessary to evaluate image quality and interpretability in order to effectively improve the screening efficiency. Therefore, it is of great practical significance to study the evaluation method of the remote sensing image quality interpretation scale [1,2,3,4,5,6,7,8,9,10,11,12,13,14].
The National Imagery Interpretability Rating Scale (NIIRS) [15,16,17,18,19] is the most commonly used international subjective evaluation guideline for image quality. It is a task-based scale that classifies image quality into 10 levels from 0 to 9, with the degree of interpretation increasing step by step. On the other hand, the General Image Quality Equation (GIQE) [19,20,21,22,23,24,25,26,27,28] is a quality assessment model developed for the NIIRS. It uses the imaging parameters of the image itself to assess the NIIRS of the image.
There are three known versions of the GIQE: GIQE3, GIQE4 and GIQE5. The latest version, the GIQE5, was released in 2015 by the National Geospatial-Intelligence Agency (NGA) through a paper on its website [25]. However, because the GIQE5 only calculates a single NIIRS level for "well-enhanced" images as a function of their unenhanced parameters, without modeling the processing details, it cannot be used to predict the NIIRS of a given image. The GIQE4 therefore remains widely used [28], and the model in this paper is optimized based on the GIQE4.
The GIQE4 quantifies the degradation of an image in terms of spatial resolution and sharpness, forming a mapping between image imaging parameters and NIIRS grade. However, because it does not characterize the human eye's ability to perceive image information, its predictions are largely unreliable and unconvincing, and it also suffers from low accuracy.
Researchers from different countries have conducted studies on image quality interpretation from various perspectives. For instance, Valenzuela et al. [28] conduct a comparative analysis of different versions of the GIQE, while Jingbo Bai et al. [29] study image NIIRS and GIQE for Unmanned Aerial Vehicle (UAV) intelligence and reconnaissance actions. Lin L et al. [30] establish a regression model of manual interpretation results and model prediction results, considering that the ground spatial resolution of the ZY-3 satellite is not within the applicable range of parameters of the GIQE model. Pengfei Zhao [31] improves the GIQE model by using three different Modulation Transfer Function (MTF) performance parameters.
The above-mentioned studies analyzed the GIQE from different angles and optimized image quality evaluation methods in certain respects, but none of them combined the human eye's ability to perceive image information with the prediction of the subjective NIIRS level judged by experts (hereinafter abbreviated as the subjective NIIRS level). As a result, the prediction models still suffer from poor reliability and low accuracy.
In response to this issue, this paper analyzes the correlation between the target task performance (TTP) criterion [32,33,34,35,36,37] and remote sensing image quality interpretation, takes the TTP criterion as a key parameter in the prediction model of the remote sensing image quality interpretation scale, and constructs a New Remote Sensing Image Quality Equation (NRSIQE). This equation can reflect the characteristics of the human visual system’s perception of image information in the prediction. On completion of regression fits and precision analysis, a New Optimized Remote Sensing Image Quality Equation (NORSIQE) is constructed by choosing a logarithmic function to represent the signal-to-noise ratio term. Compared with the GIQE4, the prediction accuracy is improved by 87%. The verification results show that this method can effectively improve the prediction reliability and accuracy of the remote sensing image quality interpretation model.

2. Deficiencies of the GIQE

The GIQE is an image quality assessment model that was developed for the NIIRS. It uses the system imaging parameters of the image, which are objective evaluation metrics, to evaluate the NIIRS, which is a subjective evaluation metric. The principle of the GIQE is illustrated in Figure 1. The widely used GIQE4 is calculated as follows:
$$\mathrm{NIIRS} = 10.251 - a\,\lg(\mathrm{GSD}) + b\,\lg(\mathrm{RER}) - 0.656\,H - 0.344\,(G/\mathrm{SNR}) \tag{1}$$
where lg(x) is the base-10 logarithm of x; GSD is the Ground Sampling Distance (in inches); RER is the Relative Edge Response; H is the edge-response overshoot caused by Modulation Transfer Function Compensation (MTFC); G is the sharpening filter noise gain; SNR is the signal-to-noise ratio; and a and b are constants: a = 3.32 and b = 1.559 when RER ≥ 0.9, and a = 3.16 and b = 2.817 when RER < 0.9.
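As a concrete illustration, the GIQE4 calculation can be sketched as below. This is a minimal sketch, not the paper's code: the function name and example parameter values are ours, and the −0.344 coefficient on G/SNR is the published GIQE4 value.

```python
import math

def giqe4_niirs(gsd_inches, rer, h, g, snr):
    """Illustrative GIQE4 NIIRS prediction (Equation (1)).

    gsd_inches : ground sampling distance in inches
    rer        : relative edge response
    h          : edge-response overshoot from MTFC
    g          : sharpening filter noise gain
    snr        : signal-to-noise ratio
    """
    # The a, b coefficients switch at RER = 0.9
    if rer >= 0.9:
        a, b = 3.32, 1.559
    else:
        a, b = 3.16, 2.817
    return (10.251
            - a * math.log10(gsd_inches)
            + b * math.log10(rer)
            - 0.656 * h
            - 0.344 * g / snr)
```

For example, a hypothetical image with GSD = 10 in, RER = 1.0, H = 1.0, G = 1.0, SNR = 50 yields a NIIRS of roughly 6.27; degrading RER to 0.5 drops it to about 5.58, reflecting the loss of edge sharpness.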
As Equation (1) shows, the GIQE4 predicts the image quality interpretation scale from the spatial resolution, sharpness and noise of the image. It considers only the objective imaging quality parameters and focuses on the influence of spatial resolution degradation on image quality and interpretability. However, image interpretation is a behavior driven by human subjective judgment, so the results of interpretability evaluation [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19] are affected by how the human eye perceives the information contained in the image. Because the GIQE4 does not take into account the human visual system's ability to perceive image information, it cannot accurately predict human understanding of the image content or the judgment of the interpretation task, which undermines the reliability and accuracy of the prediction model.
To address the GIQE4's failure to reasonably account for the observer's subjective visual perception, and thereby improve the accuracy and completeness of image quality interpretation scale prediction, this paper proposes introducing the TTP criterion into the remote sensing image quality equation, so that the prediction model reflects the effect of human visual perception characteristics on the judgment of image interpretation tasks.

3. Theoretical Basis for the Introduction of the TTP Criterion

The quality and interpretability scale of remote sensing images are closely related to the objective image quality and human perception. In order to effectively improve the prediction accuracy of the model, it is necessary to evaluate the image quality and combine the understanding and perception characteristics of the human visual system with the information contained in the image.
The TTP criterion is a comprehensive performance metric defined by the US Army Night Vision and Electronic Sensors Directorate (NVESD) to describe the eye's spatial integration of information over the effective spectrum [32,33]. Figure 2 illustrates the schematic diagram of the TTP criterion, where the intersection of the delivered-contrast curve C_tgt·MTF_sys(ξ) (C_tgt is the target modulation contrast, MTF_sys(ξ) is the Modulation Transfer Function of the system, and ξ is the spatial frequency) with the Contrast Threshold Function of the human eye, CTF_eye(ξ_eye), is ξ_h. This intersection represents the limiting resolution for the image interpretation task: only the frequencies at which C_tgt·MTF_sys(ξ) exceeds CTF_eye(ξ_eye) can be perceived by human vision. Integrating over this effective band collects all the image information perceived by the eye, characterizing the eye's spatial integration effect. The calculation expression is:
$$\mathrm{TTP} = \int_{\xi_l}^{\xi_h} \sqrt{\frac{C_{tgt}\,\mathrm{MTF}_{sys}(\xi)}{\mathrm{CTF}_{eye}(\xi_{eye})}}\,d\xi \tag{2}$$
$$\mathrm{MTF}_{sys}(\xi) = \mathrm{MTF}_{opt}(\xi)\cdot\mathrm{MTF}_{det}(\xi)\cdot\mathrm{MTF}_{e}(\xi)\cdot\mathrm{MTF}_{dis}(\xi) \tag{3}$$
$$\mathrm{CTF}_{eye}(\xi_{eye}) = \frac{a}{b\,\xi_{eye}\,e^{-c\,\xi_{eye}}\sqrt{1 + 0.06\,e^{c\,\xi_{eye}}}} \tag{4}$$
$$a = 1 + \frac{12}{\omega\,(1 + \xi_{eye}/3)^{2}} \tag{5}$$
$$b = 540\,(1 + 0.7/L)^{-0.2} \tag{6}$$
$$c = 0.3\,(1 + 100/L)^{0.15} \tag{7}$$
where ξ_l is the integration start frequency; ξ_h is the integration cutoff frequency; MTF_opt(ξ), MTF_det(ξ), MTF_e(ξ) and MTF_dis(ξ) are the MTFs of the optical system, the detector, the electronic circuit and the display, respectively; ξ_eye = ξ/SMAG is the spatial frequency at the human eye [37]; SMAG is the system angular magnification; ω is the square root of the image area expressed in degrees; and L is the display luminance in cd/m² [38].
The integration limits are subject to the condition that C_tgt·MTF_sys(ξ) is greater than CTF_eye(ξ_eye). CTF_eye(ξ_eye) is the contrast threshold of image information that the human eye can perceive; it is affected by spatial resolution, image area, average display brightness and system magnification, and it reflects the ability of human vision to perceive image information under different display conditions.
It can be seen that the TTP criterion is obtained by comparing C_tgt·MTF_sys(ξ) against CTF_eye(ξ_eye) and integrating their ratio over the effective spatial frequency band. It reflects not only the imaging system's performance, the cascaded diffusion in the remote sensing optical imaging chain, the target-background contrast, the overall image quality and the display brightness setting, but also how the human eye perceives the effective information of remote sensing images. It is thus a comprehensive index of objective image quality and subjective human perception characteristics.
In view of this, the TTP criterion is introduced into the remote sensing image quality interpretation model, which can provide theoretical support to improve the reliability and accuracy of the remote sensing image interpretability prediction model.
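The TTP and eye-CTF computation described above can be sketched numerically as follows. This is an illustrative sketch, not the paper's implementation: the function names, the assumed display parameters (ω, L), and the toy system MTF in the usage note are our assumptions.

```python
import numpy as np

def barten_ctf(xi_eye, omega=5.0, luminance=30.0):
    """Barten-style eye contrast threshold function (Equations (4)-(7)).

    xi_eye    : spatial frequency at the eye (cycles/degree)
    omega     : square root of the image area in degrees (assumed value)
    luminance : display luminance L in cd/m^2 (assumed value)
    """
    a = 1.0 + 12.0 / (omega * (1.0 + xi_eye / 3.0) ** 2)
    b = 540.0 * (1.0 + 0.7 / luminance) ** -0.2
    c = 0.3 * (1.0 + 100.0 / luminance) ** 0.15
    return a / (b * xi_eye * np.exp(-c * xi_eye)
                * np.sqrt(1.0 + 0.06 * np.exp(c * xi_eye)))

def ttp(c_tgt, mtf_sys, smag, xi, omega=5.0, luminance=30.0):
    """TTP metric (Equation (2)): integrate sqrt(C_tgt*MTF_sys/CTF_eye)
    over the frequencies where delivered contrast exceeds the eye threshold."""
    ctf = barten_ctf(xi / smag, omega, luminance)
    ratio = c_tgt * mtf_sys / ctf
    visible = ratio > 1.0          # only frequencies the eye can resolve
    if np.count_nonzero(visible) < 2:
        return 0.0                 # no effective band to integrate
    f = np.sqrt(ratio[visible])
    x = xi[visible]
    # trapezoidal rule over the effective band [xi_l, xi_h]
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))
```

For instance, with a toy exponential system MTF, `ttp(0.3, np.exp(-xi/30), smag=1.0, xi=np.linspace(0.5, 60, 600))` returns a positive value that grows with target contrast, consistent with the criterion's monotonic dependence on delivered contrast.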

4. New Quality Equation for Remote Sensing Images

The image quality equation is a bridge between the subjective and objective evaluation of images [12,13,14]. In this paper, the TTP criterion, as a comprehensive evaluation index, is proposed as a parameter in the new equation for remote sensing image quality. Thus, the objective and subjective evaluation factors can be effectively combined to improve the reliability and accuracy of predictions [32,33,34,35,36].

4.1. Construction of the New Model

In the GIQE4, RER represents the spread of the image Edge Spread Function (ESF), and the grayscale diffusion range reflects the degradation of edge sharpness, but this is not sufficient to represent overall image quality. In contrast, MTF_sys(ξ), which is covered by the TTP criterion, measures both the diffusion and the clarity of remote sensing optical imaging systems, and reflects the on-orbit imaging performance of sensors and the overall image quality more directly than RER [11]. CTF_eye(ξ_eye) is the contrast threshold of image information that the human eye can perceive; it is affected by spatial resolution, image area, average display brightness and system magnification, and it reflects the ability of human vision to perceive image information under different display conditions [37]. In addition, using the TTP criterion [32,33,34,35,36] directly as an independent variable of the remote sensing image quality equation already characterizes the role of MTFC to a certain extent. Because the edge overshoot H and noise gain G are closely tied to MTFC, extracting the MTF of the image directly also accounts for their influence. Therefore, H and G are no longer used as independent variables in the image quality equation, which reduces the complexity of the model.
Meanwhile, the TTP criterion considers all spatial frequency information that can be perceived by human eyes on the display, reflecting the spatial integral effect of the human vision system. Therefore, the TTP criterion is introduced into the image quality interpretation scale prediction model, which can realize the precision interpretation of conventional aperiodic remote sensing image quality.
Furthermore, the quality and interpretability scale of remote sensing images are closely related to the detail definition of the images. While spatial resolution reflects the detailed information contained in an image, noise has a significant impact on image sharpness and contrast. Therefore, GSD and SNR continue to be regarded as the key parameters in the NRSIQE in the scale estimation of remote sensing image quality.
In view of the above analysis, the specific expression of the NRSIQE is shown in Equation (8).
$$\mathrm{NIIRS} = c_1 + c_2\,\lg(\mathrm{GSD}) + c_3\,\lg(\mathrm{TTP}) + c_4\,\mathrm{SNR} \tag{8}$$
where lg(x) is the base-10 logarithm of x; GSD is the Ground Sampling Distance (in inches); TTP is the information acquisition performance obtained by integrating, in the frequency domain, the delivered image contrast against the human eye's contrast threshold function; SNR is the signal-to-noise ratio; and c1, c2, c3 and c4 are undetermined coefficients.
In order to determine the quantitative relationship between the key parameters of the NRSIQE (GSD, TTP and SNR) and the subjective NIIRS level, the imaging parameters of 200 groups of remote sensing image slices from three different sensors, acquired at different times and places, together with the corresponding subjective NIIRS levels (each evaluated by more than five experts, with the median taken), are used as the test data set. Table 1 shows part of the image information in the test set, and sample remote sensing image slices are shown in Figure 3.
Based on the sample data corresponding to the test data set, the regression model corresponding to the NRSIQE is established by using the least squares method, and the fitted results obtained are shown in Figure 4.
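The least squares fit described above can be sketched as below. This is an illustrative sketch, not the paper's code: the function name is ours, and the data used in the usage note are synthetic stand-ins for the actual test set.

```python
import numpy as np

def fit_quality_equation(gsd, ttp_vals, snr, niirs, log_snr=False):
    """Least-squares fit of c1..c4 in Equation (8) (NRSIQE), or in
    Equation (10) (NORSIQE) when log_snr=True.

    gsd, ttp_vals, snr : 1-D arrays of per-image parameters
    niirs              : 1-D array of subjective NIIRS levels
    Returns (coefficients [c1, c2, c3, c4], R^2).
    """
    snr_term = np.log10(snr) if log_snr else snr
    # Design matrix columns: 1, lg(GSD), lg(TTP), SNR term
    X = np.column_stack([np.ones_like(gsd), np.log10(gsd),
                         np.log10(ttp_vals), snr_term])
    coef, *_ = np.linalg.lstsq(X, niirs, rcond=None)
    pred = X @ coef
    ss_res = np.sum((niirs - pred) ** 2)
    ss_tot = np.sum((niirs - niirs.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot  # coefficients and R^2
```

On synthetic data generated from known coefficients, the fit recovers those coefficients exactly, which is a quick sanity check before applying the same routine to real test-set parameters.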
In the above figure, the blue solid line represents the prediction result of the regression-fitted model corresponding to the NRSIQE, and the orange dotted line represents the subjective NIIRS level. Evidently, the predicted value of the NRSIQE obtained by fitting is in good agreement with the subjective NIIRS level, and the corresponding model expression is:
$$\mathrm{NIIRS} = 3.3341 - 2.729\,\lg(\mathrm{GSD}) + 1.3091\,\lg(\mathrm{TTP}) - 0.2741\,\mathrm{SNR} \tag{9}$$
The determination coefficient of the model is R² = 0.421, and the null hypothesis of no relationship is rejected (p = 9.67 × 10⁻⁷³), indicating that the regression coefficients are significant.
In order to further improve the reliability of the regression model, consider improving the stability of the data without changing the nature and relationship of the data. On the basis of Equation (8), the logarithmic function is chosen to express the SNR term, which not only improves the stability of the data, but also reduces the redundancy of the model. A NORSIQE with a unified form of operation is built. The specific expression is:
$$\mathrm{NIIRS} = c_1 + c_2\,\lg(\mathrm{GSD}) + c_3\,\lg(\mathrm{TTP}) + c_4\,\lg(\mathrm{SNR}) \tag{10}$$
The meaning of each parameter in the above formula is the same as that in Equation (8).
Similarly, in order to clarify the values of c1, c2, c3 and c4, the regression model corresponding to the NORSIQE was fitted with the same method, and the expression is as follows:
$$\mathrm{NIIRS} = 4.6366 - 3.2058\,\lg(\mathrm{GSD}) + 0.5996\,\lg(\mathrm{TTP}) + 3.5611\,\lg(\mathrm{SNR}) \tag{11}$$
The determination coefficient of the model is R² = 0.916, and the null hypothesis is again rejected (p = 6.54 × 10⁻¹⁰⁶).
The coefficients of the NRSIQE and NORSIQE and the corresponding regression analysis indicators were compared and analyzed, and the results are shown in Table 2.
By comparing the determination coefficient ( R 2 ) and Root Mean Square Error (RMSE) of the two regression models, it can be seen that the NORSIQE has higher regression significance and reliability, and its prediction results can form a good consistent correlation with the subjective NIIRS level corresponding to the image samples in the remote sensing image test data set.

4.2. Verification of the New Model

In order to verify the prediction accuracy and validity of the NORSIQE, the imaging parameters of another 120 sets of remote sensing image slices from four different sensors, acquired at different times and places, together with their corresponding subjective NIIRS levels, were used as the validation data set. The absolute difference between the predictions of the GIQE4, NRSIQE and NORSIQE and the subjective NIIRS level, denoted ΔNIIRS, is defined in Equation (12).
$$\Delta\mathrm{NIIRS} = \left|\,\mathrm{NIIRS}_{prediction} - \mathrm{NIIRS}_{subjective}\,\right| \tag{12}$$
where NIIRS_prediction is the NIIRS level predicted by each model (the GIQE4, NRSIQE, or NORSIQE) and NIIRS_subjective is the subjective NIIRS level.
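The deviation statistics used in this section can be sketched as below. This is an illustrative sketch: the function names are ours, and the interval binning is a half-open approximation of the paper's ΔNIIRS thresholds.

```python
import numpy as np

def delta_niirs(predicted, subjective):
    """Per-slice absolute prediction deviation (Equation (12))."""
    return np.abs(np.asarray(predicted, float) - np.asarray(subjective, float))

def accuracy_profile(predicted, subjective, edges=(0.1, 0.2, 0.3, 0.5)):
    """Fraction of validation slices whose deviation falls in each
    interval (approximating the paper's bins), plus the RMSE."""
    d = delta_niirs(predicted, subjective)
    bins = np.concatenate([[0.0], edges, [np.inf]])
    counts, _ = np.histogram(d, bins=bins)
    rmse = float(np.sqrt(np.mean(d ** 2)))
    return counts / d.size, rmse
```

For a hypothetical trio of predictions (4.05, 5.25, 6.0) against subjective levels (4, 5, 6), two of three slices fall in the first interval and one in the third, with an RMSE of about 0.147.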
The comparative analysis results of prediction accuracy obtained are shown in Figure 5.
From the ΔNIIRS values of the three models in Figure 5a, the GIQE4 predictions show the largest and least stable errors with respect to the subjective NIIRS level; the NRSIQE errors are the most stable; and the NORSIQE predictions deviate least from the subjective NIIRS level. Figure 5b shows more intuitively that the NORSIQE achieves higher prediction accuracy than the NRSIQE.
On the basis of the interpretation tasks in this study, data samples with subjective NIIRS levels of 4, 5, 6, 7 and 8 are screened, and 20 groups of samples at each level are used as validation data. The three models (GIQE4, NRSIQE and NORSIQE) are applied to predict remote sensing image quality at the different subjective NIIRS levels, and the deviation ΔNIIRS of each model is calculated. The prediction accuracy of the three models for the different interpretation tasks is compared in Figure 6.
As can be seen from Figure 6, for the quality prediction of remote sensing images with different interpretation tasks, the NORSIQE has fewer prediction errors and shows better stability than the GIQE4 and NRSIQE.
In order to quantitatively compare the prediction accuracy of the above three models, the RMSE corresponding to the three models was calculated based on the subjective NIIRS level, as shown in Table 3.
According to the data in the table, for the remote sensing image sample data of different interpretation tasks in the validation data set, the average prediction deviation of the NORSIQE is 0.231 smaller than that of the NRSIQE, and 1.775 smaller than that of the GIQE4.
For the predictions of the three models, the number of validation image slices whose deviation falls in each interval (ΔNIIRS ≤ 0.1, 0.1 < ΔNIIRS ≤ 0.2, 0.2 < ΔNIIRS ≤ 0.3, 0.3 < ΔNIIRS ≤ 0.5 and ΔNIIRS > 0.5) was counted, and the resulting percentages are shown in Figure 7.
The figure shows that 64% of the NORSIQE's prediction deviations on the validation data set are within 0.1, and more than 95% satisfy ΔNIIRS ≤ 0.5, improvements of 44% and 87% over the prediction accuracy of the NRSIQE and GIQE4, respectively. The NORSIQE predictions are thus effective across the different remote sensing image interpretation tasks.
Meanwhile, the accuracy ratio of the model proposed in this paper, along with the MTF-Nyquist model, MTF-50 model, and MTF-Area model proposed in reference [31], was statistically analyzed according to the ∆ NIIRS interval mentioned above, and the results are shown in Figure 8.
From the figure, it can be seen that among the models of reference [31], the MTF-50 model achieves the best prediction accuracy, 50%, for ΔNIIRS ≤ 0.1. The NORSIQE model proposed in this paper exceeds this by 14%, demonstrating a clear advantage.
The above validation results show that introducing the TTP criterion into the remote sensing image quality equation, together with the logarithmic treatment of the SNR parameter, effectively improves the regression significance and prediction accuracy of the equation. This paper therefore adopts the NORSIQE as the model for quantitatively describing the interpretability of remote sensing image information, as it characterizes the perceptual properties of the human visual system while comprehensively describing image diffusion and overall quality degradation. The result is a new TTP-based characterization method for the remote sensing image quality interpretation scale, which improves accuracy by 87% over the GIQE4 and effectively improves the reliability of the prediction model.

5. Conclusions

In view of the unreliability and inaccuracy of the NIIRS predicted by the existing GIQE4 model, which cannot represent human visual perception, this paper introduces the TTP criterion into the image quality equation as a key parameter; it characterizes not only the objective quality of remote sensing images but also the spatial integration effect of the human visual system. A regression fit and accuracy assessment of the model are performed using a test set and a validation set totaling 320 data groups. Through comparative analysis, a NORSIQE model with a determination coefficient of 0.916 is established. Compared to the GIQE4, its prediction accuracy improves by 87% (for ΔNIIRS ≤ 0.5). This method provides a new way to evaluate the quality of remote sensing images.
In the future, we will consider the full chain of remote sensing photoelectric imaging, incorporating the physical effects of imaging and combining them with innovative technologies, to further improve the intelligence and automation of remote sensing image interpretation prediction.

Author Contributions

Conceptualization, Y.L. and X.W.; methodology, Y.L. and X.W.; software, Y.L.; validation, Y.L.; formal analysis, Y.L. and X.W.; resources, Y.L. and X.W.; data curation, Y.L.; writing—original draft preparation, Y.L.; writing—review and editing, Y.L., C.Z. and X.W.; visualization, Y.L.; supervision, X.W. and C.Z. All authors have read and agreed to the published version of the manuscript.

Funding

Fundamental Research Funds for the Central Universities; National Natural Science Foundation of China (61775174, 62005204, 62005206, 62075176).

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available because they come from a cooperative research institute and are subject to privacy and confidentiality restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhou, J.; Ruwei, D.; Baihua, X. Overview of Research on Image Quality Evaluation. Comput. Sci. 2008, 35, 1–4. [Google Scholar]
  2. Lu, M. Research on Remote Sensing Image Quality Evaluation Methods; Harbin Institute of Technology: Harbin, China, 2013. [Google Scholar]
  3. Yin, L.Z.; Zhu, J.; Cai, G.L.; Wang, J.H. A review of remote sensing image quality evaluation methods. Mapp. Spat. Geogr. Inf. 2014, 37, 32–35+45. [Google Scholar]
  4. Zhang, W.; Wang, Y. High order statistics and evaluation of edge structure similarity image quality. J. Xi’an Univ. Technol. 2016, 36, 173–176. [Google Scholar]
  5. Qin, C.; Li, X. Color image quality evaluation algorithm based on frequency domain histogram and HVS. Telev. Technol. 2015, 39, 28–33. [Google Scholar]
  6. Liu, L.; Wang, Y.; Wu, Y. A Wavelet-Domain Structure Similarity for Image Quality Assessment. In Proceedings of the International Congress on Image and Signal Processing, Tianjin, China, 17–19 October 2009; IEEE: Piscataway, NJ, USA; pp. 1–5. [Google Scholar]
  7. Wei, X.; Li, J.; Chen, G. An image perception quality evaluation model. J. Comput. Aided Des. Graph. 2007, 19, 1540–1545. [Google Scholar]
  8. Liu, S.; Wu, L.; Gong, Y.; Liu, X. A review of image quality evaluation. China Sci. Technol. Pap. Online 2011, 6, 501–506+523. [Google Scholar]
  9. Jiang, G.; Huang, D.; Wang, X.; Yu, M. Research progress of image quality evaluation methods. J. Electron. Inf. 2010, 32, 219–226. [Google Scholar] [CrossRef]
  10. Leachtenauer, J.C.; Driggers, R.G. Surveillance and Reconnaissance Imaging System (Model and Performance Prediction); China Science and Technology Press: Beijing, China, 2007. [Google Scholar]
  11. Yang, F. Research on Radiation Quality Evaluation and Influencing Factors of High-Resolution Remote Sensing Images Based on Synchronous Observation; Wuhan University: Wuhan, China, 2016. [Google Scholar]
  12. Hu, A. Research on Subjective and Objective Consistency in Image Perception Quality Evaluation Methods; University of Science and Technology of China: Hefei, China, 2014. [Google Scholar]
  13. Li, L. Research on Remote Sensing Image Quality Evaluation Method Combining Subjective and Objective Methods; Nanjing University of Science and Technology: Nanjing, China, 2013. [Google Scholar]
  14. Liu, X.; Ren, X.; Zheng, Y.; Hu, T. Correlation analysis of subjective and objective evaluation of image fusion quality. J. Shenzhen Inst. Inf. Technol. 2011, 9, 23–30. [Google Scholar]
  15. Driggers, R.G.; Cox, P.; Kelley, M. National imagery interpretation rating system and the probabilities of detection, recognition, and identification. Opt. Eng. 1997, 36, 1952–1959. [Google Scholar] [CrossRef]
  16. Irvine, J.M. National imagery interpretability rating scales (NIIRS): Overview and methodology. In Airborne Reconnaissance XXI; SPIE: Cergy-Pontoise, France, 1997; Volume 3128, pp. 93–103. [Google Scholar]
  17. Shi, H.; Chen, S. A remote sensing image quality standard for user mission requirements-NIIRS. Space Return Remote Sens. 2003, 24, 30–35. [Google Scholar]
  18. Bai, H. Research on Image Quality Prediction and Evaluation Method of Remote Sensing System Based on NIIRS; Xi’an University of Electronic Science and Technology: Xi’an, China, 2010. [Google Scholar]
  19. Leachtenauer, J.C. Image Quality Equation and NIIRS. In Encyclopedia of Optical Engineering; CRC Press: Boca Raton, FL, USA, 2003; pp. 794–811. [Google Scholar]
  20. Leachtenauer, J.C.; Malila, W.; Irvine, J.; Colburn, L.; Salvaggio, N. General Image-Quality Equation: GIQE. Appl. Opt. 1997, 36, 8322–8328. [Google Scholar] [CrossRef] [PubMed]
  21. Thurman, S.T.; Fienup, J.R. Analysis of the general image quality equation. In Visual Information Processing XVI; SPIE: Cergy-Pontoise, France, 2008; Volume 6978, pp. 73350L-1–73350L-9. [Google Scholar]
  22. Irvine, J.M.; Nelson, E. Image quality and performance modeling for automated target detection. In Automatic Target Recognition XIX; SPIE: Cergy-Pontoise, France, 2009; Volume 7335, pp. 73350L-1–73350L-9. [Google Scholar]
  23. Cota, S.A.; Florio, C.J.; Duvall, D.J.; Leon, M.A. The Use of the General Image Quality Equation in the Design and Evaluation of Imaging Systems. In Remote Sensing System Engineering II; SPIE: Cergy-Pontoise, France, 2009; Volume 7458, pp. 74580H-1–74580H-20. [Google Scholar]
  24. Hindsley, R.; Rickard, L. The General Image Quality Equation and the structure of the modulation transfer function. In New Frontiers in Stellar Interferometry; SPIE: Cergy-Pontoise, France, 2004; Volume 5491, pp. 1557–1562. [Google Scholar]
  25. Harrington, L.; Blanchard, D.; Salacain, J.M.; Smith, S.J.; Amanik, P.S. General Image Quality Equation; GIQE Version 5; National Geospatial-Intelligence Agency: Springfield, VA, USA, 2015. [Google Scholar]
  26. Jason, M. Back-of-the-envelope image quality estimation using the national image interpretability rating scale: Erratum. Appl. Opt. 2019, 58, 8839. [Google Scholar]
  27. Lemaster Daniel, A. Airborne validation of the general image quality equation 5. Appl. Opt. 2020, 59, 9978–9984. [Google Scholar] [CrossRef] [PubMed]
  28. Valenzuela, A.Q.; Reyes, J.C.G. Comparative study of the different versions of the general image quality equation. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, IV–2/W5, 493–500. [Google Scholar] [CrossRef]
  29. Bai, J.; Sun, Y.; Chen, L.; Feng, Y.; Liu, J. EO Sensor Planning for UAV Engineering Reconnaissance Based on NIIRS and GIQE. Math. Probl. Eng. 2018, 2018, 6837014. [Google Scholar] [CrossRef]
  30. Lin, L.; Heng, L.; Haihong, Z. Estimation of the Image Interpretability of ZY-3 Sensor Corrected Panchromatic Nadir Data. Remote Sens. 2014, 6, 4409–4429. [Google Scholar] [CrossRef]
  31. Zhao, P. Evaluation of the Quality of Domestic High Resolution Images Based on GIQE. Master’s Thesis, Wuhan University, Wuhan, China, 2020. [Google Scholar]
  32. Vollmerhausen, R.H.; Jacobs, E. The Targeting Task Performance (TTP) Metric a New Model for Predicting Target Acquisition Performance; Technical Report AMSEL-NV-TR-230; US Army CERDEC: Fort Belvoir, VA, USA, 2004; p. 22060. [Google Scholar]
  33. Vollmerhausen, R.H.; Jacobs, E.; Driggers, R.G. New Metric for Predicting Target Acquisition Performance. Opt. Eng. 2004, 43, 2806–2818. [Google Scholar] [CrossRef]
  34. He, L. Research on Target Mission Performance Evaluation Based on TTP Criteria; Xi’an University of Electronic Science and Technology: Xi’an, China, 2010. [Google Scholar]
  35. Bai, H.; Wang, X. A Quantitative Study on the Relationship between NIIRS and Target Task Performance Prediction. In Proceedings of 2011 Western Photonics Academic Conference Abstract Collection, San Francisco, CA, USA, 22–27 January 2011; Shaanxi Optical Society: Xi’an, China; High Speed Photography and Photonics Professional Committee of the Chinese Optical Society: Beijing, China, 2011; p. 22. [Google Scholar]
  36. Chen, Y.; Jin, W.; Zhao, L.; Zhao, L. A new method for infrared system evaluation based on target mission performance. Opt. Technol. 2008, 4, 555–559. [Google Scholar]
  37. Guo, L. Research on Performance Evaluation Technology of Infrared Imaging System Based on TTP Criteria; Harbin Institute of Technology: Harbin, China, 2021. [Google Scholar]
  38. Barten, P.G.J. The Square Root Integral (SQRI): A New Metric to Describe the Effect of Various Display Parameters on Perceived Image Quality. In Human Vision, Visual Processing, and Digital Display; SPIE: Cergy-Pontoise, France, 1989; Volume 1077. [Google Scholar]
Figure 1. Remote sensing image quality interpretation process. The region enclosed by the green dashed line is the GIQE model (comprising target geometric characteristics, sensor performance, and processing effects); the two graphs illustrate the calculation of the relative edge response and the edge-response overshoot.
Figure 2. TTP criterion principle.
Figure 3. Sample of remote sensing image slices. (a) ship in port; (b) river course; (c) house and crop; (d) buildings and roads.
Figure 4. Image interpretation scale fitted result plot.
Figure 5. The error analysis of the models: (a) error analysis of the GIQE4, NRSIQE, and NORSIQE; (b) error analysis of the remote sensing image quality equations before and after optimization, where the orange dashed line is the trend line for the case of equal computational errors of the two equations, and the blue dashed line is the trend line of their actual errors.
Figure 6. The error analysis of the models: (a) subjective NIIRS = 4; (b) subjective NIIRS = 5; (c) subjective NIIRS = 6; (d) subjective NIIRS = 7; (e) subjective NIIRS = 8.
Figure 7. The ∆NIIRS distribution of the models.
Figure 8. The distribution of ∆NIIRS between the NORSIQE model and the MTF-Nyquist, MTF-50, and MTF-Area models.
Table 1. Image raw data information.

| Type | Date       | Time     | Location                   | Spatial Resolution | Subjective NIIRS Level |
|------|------------|----------|----------------------------|--------------------|------------------------|
| XX_1 | 2019.07.19 | 09:24:06 | E 118°07′14″, N 32°27′21″ | 6.80 inches        | 4                      |
| XX_1 | 2019.06.24 | 16:18:30 | E 119°46′41″, N 31°22′34″ | 25.78 inches       | 5                      |
| XX_2 | 2013.09.15 | 09:00:00 | E 117°12′17″, N 23°33′42″ | 6.99 inches        | 7                      |
| XX_2 | 2018.07.05 | 08:22:09 | E 117°39′45″, N 24°33′42″ | 7.22 inches        | 7                      |
| XX_2 | 2021.03.23 | 15:22:06 | E 120°06′07″, N 30°57′32″ | 2.72 inches        | 7                      |
| XX_3 | 2018.02.08 | 16:36:26 | E 120°06′07″, N 30°57′32″ | 3.27 inches        | 5                      |
| XX_3 | 2019.06.14 | 13:50:50 | E 120°06′04″, N 30°06′04″ | 1.46 inches        | 6                      |
| XX_3 | 2020.02.11 | 13:35:23 | E 123°03′01″, N 41°52′10″ | 4.20 inches        | 6                      |
Table 2. NRSIQE and NORSIQE coefficients for GSD in inches.

| Equation | c1     | c2      | c3     | c4     | f(x)    | R²    | p             | RMSE  |
|----------|--------|---------|--------|--------|---------|-------|---------------|-------|
| NRSIQE   | 3.3341 | −2.7291 | 1.3091 | −0.274 | 1/SNR   | 0.793 | 9.67 × 10⁻⁷³  | 0.421 |
| NORSIQE  | 4.6366 | −3.2058 | 0.5996 | 3.5611 | lg(SNR) | 0.916 | 6.54 × 10⁻¹⁰⁶ | 0.204 |
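For readers who want to reproduce this kind of fit, the sketch below shows how coefficients like those in Table 2 can be recovered by least squares regression. The design-matrix form (an intercept plus lg(GSD), lg(TTP), and an SNR term) is an assumption inferred from the GIQE-style structure and the key parameters (GSD, TTP, SNR) named in the abstract; the synthetic data and the reuse of the Table 2 values as a "ground truth" are illustrative, not the paper's actual 200 test sets.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # same number of samples as the paper's test set, but synthetic data

# Synthetic predictors (illustrative ranges only).
gsd = rng.uniform(1.0, 30.0, n)    # ground sample distance, inches
ttp = rng.uniform(2.0, 50.0, n)    # TTP metric value
snr = rng.uniform(5.0, 100.0, n)   # signal-to-noise ratio

# Assumed NORSIQE-like form: NIIRS = c1 + c2*lg(GSD) + c3*lg(TTP) + c4*lg(SNR).
c_true = np.array([4.6366, -3.2058, 0.5996, 3.5611])  # Table 2 row, reused as ground truth
X = np.column_stack([np.ones(n), np.log10(gsd), np.log10(ttp), np.log10(snr)])
niirs = X @ c_true + rng.normal(0.0, 0.2, n)  # noise scale ≈ the reported RMSE of 0.204

# Ordinary least squares, as used in the paper to obtain the coefficients.
c_fit, *_ = np.linalg.lstsq(X, niirs, rcond=None)

# Coefficient of determination R² of the fit.
resid = niirs - X @ c_fit
r2 = 1.0 - resid.var() / niirs.var()
```

With 200 samples and noise at the reported RMSE level, the fitted coefficients land close to the generating values, which is consistent with the high determination coefficient (0.916) the paper reports for the real data.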
Table 3. RMSE of GIQE4, NRSIQE and NORSIQE.

| Error Indicator | Subjective NIIRS Level | GIQE4 | NRSIQE | NORSIQE |
|-----------------|------------------------|-------|--------|---------|
| RMSE            | 4                      | 2.919 | 0.302  | 0.164   |
|                 | 5                      | 2.205 | 0.340  | 0.234   |
|                 | 6                      | 2.346 | 0.343  | 0.259   |
|                 | 7                      | 1.315 | 0.379  | 0.271   |
|                 | 8                      | 1.339 | 1.039  | 0.324   |
| Mean RMSE       | —                      | 2.025 | 0.481  | 0.250   |
Share and Cite

Li, Y.; Wang, X.; Zhang, C. A Remote Sensing Image Quality Interpretation Scale Characterization Method Based on the TTP Criterion. Remote Sens. 2023, 15, 4121. https://doi.org/10.3390/rs15174121