
Appl. Sci. 2020, 10(2), 658; https://doi.org/10.3390/app10020658

Article
Accurate and Rapid Auto-Focus Methods Based on Image Quality Assessment for Telescope Observation
1 School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
2 Laboratory of Imaging Detection and Intelligent Perception, University of Electronic Science and Technology of China, Chengdu 611731, China
3 School of Optoelectronic Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China
4 China Air-to-Air Missile Research Institute, Luoyang 471099, China
* Author to whom correspondence should be addressed.
Received: 13 December 2019 / Accepted: 13 January 2020 / Published: 16 January 2020


Featured Application

In this paper, the problem of fast auto-focus for telescope observation is addressed. Based on image quality assessment, a fast and effective method of selecting the focus window is presented, and multiple focus evaluation functions are compared. The proposed method can be applied to telescopes and similar optical systems.

Abstract

Aiming at improving the speed and accuracy of auto-focus for telescope observation, algorithms for image estimation and auto-focus were investigated and are discussed in this article. Based on image quality assessment, the auto-focusing process of the telescope system is realized using the hill-climbing search method. Several evaluation functions were tested in different scenarios. It is demonstrated that the Tenengrad image estimation function (IEF) is suitable for an instant and accurate auto-focus process of the telescope. Furthermore, we implemented an even sampling and dynamic adaptive focusing window (ES-DAFW) method with the Tenengrad IEF to enhance the sensitivity and accuracy of the auto-focus process. The experimental results showed that our ES-DAFW method can provide more accurate results in less time for the auto-focus process compared to conventional approaches, especially for sparse images. These results promise significant applications to the auto-focusing of other telescopes with image quality assessment.
Keywords:
auto-focus; telescope observation; image quality; dynamic adaptive focusing window; Tenengrad function

1. Introduction

As an important alternative to the human eye for detecting, identifying, and tracking moving targets steadily and continuously, electro-optical tracking systems (EOTSs) have drawn interest around the world [1,2,3]. In an EOTS, in order to provide extremely accurate results within a short response time, one of the priorities is timely and accurate auto-focus technology (AFT) to capture image information of moving objects [4,5,6].
At present, there are mainly two types of AFTs [7,8]: conventional AFTs based on measuring distance, and AFTs based on digital image processing. Conventional AFTs include the image deflection method and the position sensitive detector (PSD) distance measurement method [9], which mainly depend on infrared or ultrasonic distance measurements [10,11] with transmitters and receivers, increasing the cost of the systems. In addition, ultrasonic methods cannot auto-focus accurately on objects behind glass [12], which limits their usage scenarios. Compared to conventional AFTs, auto-focusing methods based on digital image processing (DIP-AF) offer the advantages of integration, miniaturization, and low cost [13] in applications such as digital cameras, video cameras, and microscope imaging [14,15]. However, little attention has been paid to AFTs for telescope systems, which are critical for capturing large amounts of image information through high-resolution CCDs in astronomy observations [16,17]. To investigate the focusing performance of the telescope system, in this paper, we analyzed the effect of the image estimation function and focusing window selection, and we propose a novel adaptive dynamic focusing window selection method with high sensitivity. This paper is organized as follows: the principle of DIP-AF is first given in Section 2, then the image estimation functions of different methods are compared in Section 3. In Section 4, we propose a novel method to improve the detection accuracy and speed of the DIP-AF of a telescope system. Finally, conclusions are drawn in Section 5.

2. Auto-Focus Experimental Setup

Generally speaking, if a CCD camera sensor is off the focal plane of an optical system, one can only obtain defocused images with blurred edges. In the frequency domain, this means that the high-frequency components of an image with distinct edges are filtered out due to the low-frequency pass characteristics of the system [18]. To capture a clear image, the basic idea of the AFT is to adjust the lens of the optical system to a proper position so that the CCD camera sensor is well positioned at the focal plane, and abundant high-frequency (HF) components are then obtained to provide a distinct boundary contour of the image. In digital image processing techniques, an HF filter is used to obtain the HF components of the fast Fourier transform (FFT) results of the CCD image [19]. If the amplitudes of the corresponding HF components are larger than some preset value, then we can say that the CCD image is clear enough, i.e., auto-focus has been achieved [20].
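The frequency-domain criterion above can be sketched as follows: take the 2-D FFT of the image and sum the spectral magnitudes outside a central low-frequency region. This is a minimal illustration, not the paper's implementation; the function name and the `cutoff` fraction are illustrative assumptions.

```python
import numpy as np

def high_freq_energy(img, cutoff=0.25):
    """Sum the FFT magnitudes outside a central low-frequency square.

    `cutoff` is the fraction of the (shifted) spectrum treated as
    low-frequency; both the name and the value are illustrative.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    mag = np.abs(spectrum)
    h, w = mag.shape
    ch, cw = int(h * cutoff), int(w * cutoff)
    # Zero out the central (low-frequency) region, keep the rest.
    mag[h // 2 - ch:h // 2 + ch, w // 2 - cw:w // 2 + cw] = 0.0
    return float(mag.sum())
```

A sharp, textured frame yields a much larger high-frequency energy than a featureless (defocused) one, which is the basis of the sharpness test described above.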
Applying the auto-focus principle above, the AFT of a Cassegrain telescope EOTS was investigated, as shown in Figure 1. The EOTS mainly consists of five parts [21]: Cassegrain telescope, CCD camera, PC control module, stepper motor, and mechanical conversion module. In the experiment, a Cassegrain telescope with a folded-mirror structure comprising primary and secondary parabolic reflectors was used to collect the information of detected objects. To enhance the imaging quality, a Schmidt corrector was mounted in front of the telescope, and a CCD camera [22] with a narrow-band filter was used to detect the image and send the filtered results to the PC control module. In the control module, based on the image estimation, a stepper motor was used to adjust the position of the primary mirror of the telescope.
In Figure 1, the auto-focus process is given as follows [23,24]: first, the CCD camera captures information of tracked objects through the telescope and sends the image to the PC control program. Next, the PC control program performs an estimation of the CCD image and sends commands to rotate the stepper motor clockwise or counter-clockwise. Then, a mechanical conversion module converts the rotation into linear motion that moves the primary mirror of the telescope forward or backward, and subsequently changes the imaging position. These steps are repeated until the CCD camera lies on the image plane of the telescope and a clear image is obtained, making this auto-focus process a closed-loop system.
The focus search algorithm determines the concrete focus process of the system. In the experiment, the hill-climbing search method was used to achieve a better focusing effect. The specific steps are shown in Figure 2. First, we initialize the starting position of the motor knob and set a large initial rotation step length, recording the initial IEF value as the maximum value. The stepper motor rotates the knob at the current step length to adjust the focal length of the system and constantly calculates the IEF value. When the IEF value starts to decrease, it means that the best focusing position has been missed, and the knob is then rotated one step backward. Then, the current step length size is halved and the above search process is repeated until the search is completed with the minimum step size (accuracy) of the stepper motor. Then, the auto-focus process can be considered as finished.
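The hill-climbing search with step halving described above can be sketched as follows. This is a simplified model, not the paper's control code: `ief_at(pos)` stands in for "move the motor to `pos` and compute the IEF", and integer motor positions are an assumption.

```python
def hill_climb_focus(ief_at, start, step, min_step):
    """Hill-climbing focus search with step halving.

    ief_at : callable returning the IEF value at a motor position
    start  : initial motor position
    step   : initial (large) step length
    min_step : minimum step size (motor accuracy)
    """
    pos, best = start, ief_at(start)
    direction = 1
    while step >= min_step:
        nxt = pos + direction * step
        val = ief_at(nxt)
        if val > best:
            # Still climbing: accept the move.
            pos, best = nxt, val
        else:
            # Peak passed: reverse direction and halve the step.
            direction = -direction
            step //= 2
    return pos
```

For a unimodal IEF curve the search converges to the peak within the motor's minimum step size, matching the procedure in Figure 2.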

3. Comparison of Image Estimation Functions

Different scenes, as captured by the EOTS, are shown in Figure 3. In the experiment, the CCD was first adjusted to find the correct focus position for each case. Then, the stepper motor was controlled to give a fixed clockwise and counterclockwise rotation angle, leading to a fixed moving distance of the primary mirror. When the CCD camera was in front of, located at, and behind the image plane, 17 pictures were obtained continuously for each of the scenarios shown in Figure 3.
In our auto-focus process, the sharpness of the image captured by the CCD camera should first be estimated. Currently, the most commonly used image estimation functions (IEFs) operate in the spatial or frequency domain. Spatial domain IEFs include the Tenengrad function [25], Brenner function [26], variance function, and Laplace function [27], while frequency domain IEFs mainly include the wavelet function and statistical gray level entropy functions [28]. Compared to blurred images, clear images have more detailed information in the spatial domain, and the gray scale gradients of adjacent pixels are relatively large. A suitable image function helps determine whether the image is focused or not, which is a pivotal step in the auto-focus process [29]. In order to provide an accurate comparison of the focusing sensitivity, the estimation function was normalized by the ratio of the peak deviation to the maximum value of the function and defined as
S = (E_max − E_min) / E_max
where S is the normalized function value, and Emax and Emin are the maximum and minimum values, respectively, of the original estimation function.
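As a minimal illustration of this normalization, the following sketch applies Equation (1) to a sequence of raw IEF values recorded during a focus sweep (the function name is an illustrative choice):

```python
def normalize_sensitivity(values):
    """S = (Emax - Emin) / Emax for a sequence of raw IEF values."""
    emax, emin = max(values), min(values)
    return (emax - emin) / emax
```

A larger S means the IEF curve drops more steeply away from its peak, i.e., higher focusing sensitivity.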
(1) Tenengrad Function
The Tenengrad function takes the sum of the squares of image gradients as the evaluation index, and the gradients include horizontal and vertical directions [30]. The Sobel operator and image convolution are used to realize the function, which is written as
E = Σ_x Σ_y [S(x, y)]²
where S(x, y) is the gradient magnitude at pixel (x, y) obtained by convolving the image with the Sobel operator, calculated as
S(x, y) = √(G_x²(x, y) + G_y²(x, y))
where Gx and Gy are the horizontal and vertical gradients extracted by the Sobel operator, respectively. Table 1 lists the Tenengrad function evaluation values of the focus position and the position of the front and back three steps in eight scenes. The fourth row of data presents the focus position function values.
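A self-contained sketch of the Tenengrad IEF, under the assumption of a plain 3 × 3 Sobel correlation (the helper `_conv2_valid` is a minimal stand-in for a library convolution):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T

def _conv2_valid(img, k):
    # Minimal 'valid'-mode 2-D correlation for a 3x3 kernel.
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out

def tenengrad(img):
    """Tenengrad IEF: sum of squared Sobel gradient magnitudes.

    Since S(x, y)^2 = Gx^2 + Gy^2, the square root in Eq. (3)
    cancels when forming the sum of squares in Eq. (2).
    """
    img = img.astype(float)
    gx = _conv2_valid(img, SOBEL_X)
    gy = _conv2_valid(img, SOBEL_Y)
    return float((gx ** 2 + gy ** 2).sum())
```

A uniform (featureless) image scores zero, while any image with edges scores positively and grows with edge contrast.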
(2) Brenner Function
The traditional Brenner function can extract the gradient information by calculating the gray difference between two adjacent pixels. It is the simplest method of gradient image evaluation. It can be expressed as
E = Σ_x Σ_y [f(x + 2, y) − f(x, y)]²
Because the EOTS requires a long focal length, the field-of-view angle is very small; even at a distance of several kilometers, the telescope observes only a small region, and only a small part of the observed object is the target. If this part of the object does not have obvious texture, the traditional Brenner function, which only contains the gradient information in the transverse direction and provides no statistics on the longitudinal gradient, may lose part of the edge information, which affects the focusing accuracy. Therefore, based on the Brenner function, the longitudinal gray gradient is added, and the calculation method is as follows [31]:
E = Σ_x Σ_y { [f(x + 2, y) − f(x, y)]² + [f(x, y + 2) − f(x, y)]² }
Table 2 lists the Brenner function evaluation values of the focus position and the position of the front and back three steps in eight scenes. The fourth row of data presents the focus position function value.
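A sketch of the improved (two-direction) Brenner function in Equation (5), vectorized with array slicing; the function name is an illustrative choice:

```python
import numpy as np

def brenner_2d(img):
    """Brenner function with both transverse and longitudinal gradients."""
    f = img.astype(float)
    dx = f[:, 2:] - f[:, :-2]   # f(x+2, y) - f(x, y), transverse
    dy = f[2:, :] - f[:-2, :]   # f(x, y+2) - f(x, y), longitudinal
    return float((dx ** 2).sum() + (dy ** 2).sum())
```

On an image whose gray level varies only vertically, the transverse term is zero and the traditional one-direction Brenner function would report no sharpness at all; the longitudinal term recovers that edge information, which is exactly the motivation given above.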
(3) Variance
A clear image has a larger gray scale difference than a fuzzy image, which shows that the variance of the image is larger, so the variance function is usually used to represent the gradient information of the image. The variance function can be expressed as
E = Σ_x Σ_y [f(x, y) − u]²
where u represents the average gray level of the image, and takes the following form:
u = (1 / MN) Σ_x Σ_y f(x, y)
where M and N are the image dimensions. Table 3 lists the variance function evaluation values of the focus position and the position of the front and back three steps in eight scenes. The fourth row of data presents the focus position function value.
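Equations (6) and (7) translate directly into a few lines; this sketch (name illustrative) computes the mean gray level and the summed squared deviation:

```python
import numpy as np

def variance_ief(img):
    """Variance IEF: summed squared deviation from the mean gray level."""
    f = img.astype(float)
    u = f.mean()                        # Eq. (7): average gray level
    return float(((f - u) ** 2).sum()) # Eq. (6)
```

A higher-contrast (sharper) frame spreads its gray levels more widely around the mean and therefore scores higher.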
(4) Laplace Function
The Laplace function is also a commonly used method to evaluate the gray difference in an image, including 4-domain and 8-domain Laplace templates. In order to maximize the gradient information of the image, 8-domain Laplace templates were used in the experiment and can be obtained as
E = Σ_x Σ_y |8f(x, y) − f(x − 1, y − 1) − f(x − 1, y) − f(x − 1, y + 1) − f(x, y − 1) − f(x, y + 1) − f(x + 1, y − 1) − f(x + 1, y) − f(x + 1, y + 1)|
Table 4 lists the Laplace function evaluation values of the focus position and the position of the front and back three steps in eight scenes. The fourth row of data presents the focus position function value.
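The 8-neighborhood Laplace template of Equation (8) can be sketched with array slicing over the image interior (the function name is an illustrative choice):

```python
import numpy as np

def laplace8(img):
    """8-neighborhood Laplace IEF: |8*center - sum of 8 neighbors|."""
    f = img.astype(float)
    c = f[1:-1, 1:-1]  # interior pixels
    neighbors = (f[:-2, :-2] + f[:-2, 1:-1] + f[:-2, 2:] +
                 f[1:-1, :-2] +               f[1:-1, 2:] +
                 f[2:, :-2] + f[2:, 1:-1] + f[2:, 2:])
    return float(np.abs(8 * c - neighbors).sum())
```

In a smooth (defocused) region every pixel nearly equals its neighbors, so 8f(x, y) minus the neighbor sum is close to zero; strong texture makes it large.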
(5) Entropy
The entropy function is based on the premise that the clearer the image is, the more information it contains—that is, the greater the entropy of the image—so the entropy function can also be used to reflect the clarity of the image. The entropy function is calculated by
E = Σ_x Σ_y f(x, y) ln[f(x, y)]
Table 5 lists the entropy function evaluation values of the focus position and the position of the front and back three steps in eight scenes. The fourth row of data presents the focus position function value.
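A sketch of the entropy function exactly as written in Equation (9); skipping zero-valued pixels (where the logarithm is undefined) is an implementation choice of this sketch, not stated in the paper:

```python
import numpy as np

def entropy_ief(img):
    """Entropy IEF as printed: sum of f(x, y) * ln f(x, y) over pixels."""
    f = img.astype(float)
    mask = f > 0          # ln is undefined at 0; such terms contribute 0
    return float((f[mask] * np.log(f[mask])).sum())
```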
For all these scenes, the sharpness estimations by the aforementioned IEFs of the captured image curves were calculated and are shown in Figure 4. In order to analyze the performance of each IEF, a whole-area focusing window was adopted, i.e., the whole picture was involved in the image estimation. For each subfigure in Figure 4, the abscissa and the ordinate are the picture sequence (where the ninth is the correct focus of the picture) and the normalized estimation function value, respectively.
In Figure 4a–h, the entropy function has multiple peaks, indicating instability. In Figure 4h, the Brenner and Laplace functions also have multiple extrema, making them unsuitable. In terms of the unimodal property across the corresponding eight scenarios, the Tenengrad and variance IEFs are relatively stable. In addition, the normalized Tenengrad IEF value drops faster on both sides of the peak, indicating that the Tenengrad IEF has higher sensitivity than the variance IEF. In order to quantify the focusing effect of different IEFs, the efficiency factor Ef was defined as
Ef = (E_max − E_min) / (E_max + E_min)
where Emax and Emin represent the maximum and minimum values of the IEF, respectively. The higher the value of Ef is, the better the focusing effect is. Table 6 lists the Ef values of the different IEFs in the eight scenarios.
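The efficiency factor of Equation (10) applied to a focusing curve, as a minimal sketch (function name illustrative):

```python
def efficiency_factor(values):
    """Ef = (Emax - Emin) / (Emax + Emin) for a sequence of IEF values."""
    emax, emin = max(values), min(values)
    return (emax - emin) / (emax + emin)
```

Ef approaches 1 when the IEF varies strongly over the sweep (a decisive peak) and approaches 0 for a flat, uninformative curve.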
Based on the analysis above, the Tenengrad IEF shows its applicability in reflecting the focusing process in a variety of scenarios, and it can be used as an effective and stable estimation algorithm in the auto-focusing process of a telescope system. It was also found that all IEFs are also related to the properties of the scenarios themselves, and the sparser the scenarios (Figure 3h), the smoother the change in the IEF value.

4. An Improved Focusing Window Selection Method

In Section 3, we applied different IEFs to a whole-picture focusing window which contained lots of information. For telescope systems, it will take a long time to calculate the IEF values during digital image processing, resulting in the loss of the high-frequency part of the distinct edge and poor real-time performance of the system [32]. Further, in Scene h of Figure 3, because the image is sparse, all IEF values show smooth features, which will affect the accuracy of auto-focusing. In order to speed up the process and increase the accuracy of auto-focusing, a dynamic adaptive focusing window selection (DAFWS) method is presented.
In the DAFWS, the number of pixels is greatly reduced during sharpness estimation, and the time consumed by the calculation can be significantly shortened [33]. In the meantime, the application of a focusing window will also improve the quality, accuracy, and sensitivity of image estimation [34]. The traditional methods of sampling focusing windows [35,36], such as center window, multiple spots window, and inverted T window, only sample the pixels in the selected window uniformly, and information outside the window is completely lost. Besides, for these traditional approaches, the position of the target in the image must be determined before applying the auto-focus process, which will lead to an additional cost of calculation time [37]. Another drawback of these conventional methods is that the sharpness scores significantly depend on whether the target is in the selected focus window. Once the actual imaging target is not in the window or in the central position of the selected window, there will be a greater discrepancy between the IEF value and the actual situation.
In order to resolve this inadequacy, an efficient and accurate method is proposed. Firstly, instead of sampling inside a certain window, the whole image is sampled evenly, and the whole gray distribution feature of the image is retained. There is no need to locate the target in advance. Second, rather than setting the focus window before starting the auto-focus process, an adaptive method is applied to provide a dynamic focusing window. The pixels are divided into two categories: ROI (region of interest) pixels and background pixels. Experiments determined that the IEF values of the ROI pixels change greatly compared to the ones of the background pixels during the focusing process. Therefore, the ROI region can be selected out of the background region and extracted as the focus window. For example, the texture and boundary on the railing of Figure 3h have a greater impact on the estimation function, and they are the ROI. The sky part, belonging to the background region, is of no concern during the focusing process. The specific process of this even sampling–DAFW (ES-DAFW) method is given as follows.
An M × N image is divided into several m × n sub-blocks. Then, the IEF value of each sub-block can be given by the following formula:
E = Σ_{x=0}^{m−1} Σ_{y=0}^{n−1} Tenengrad(x, y)
where (x, y) is the pixel coordinate in the sub-block, and E represents the Tenengrad IEF value of the image sub-block. The IEF value of each sub-block is recalculated as E′ after each rotation of the stepper motor. Then, the IEF value of each sub-block is normalized as
Δ = |E′ − E| / E
The threshold is set to u; the sub-blocks whose normalized IEF value is larger than u (i.e., Δ > u) are ROI sub-blocks, while the remaining ones are considered background blocks. For example, in Figure 5, a severely defocused picture of the scene in Figure 3h is divided into 8 × 8 blocks. We then used the Tenengrad IEF to calculate the normalized Δ value of all sub-blocks during the auto-focus process for the scene in Figure 3h, and we chose the five ROI sub-blocks with the largest change in Δ, shown as black boxes in Figure 5.
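The ES-DAFW block selection described above can be sketched as follows. This is a self-contained illustration, not the paper's implementation: for brevity it uses first differences as a simplified stand-in for the Sobel-based Tenengrad IEF, and it selects the top-k blocks by Δ rather than applying an explicit threshold u (both simplifications are assumptions of this sketch).

```python
import numpy as np

def _sharpness(block):
    # Simplified gradient-energy measure standing in for the
    # Tenengrad IEF (first differences instead of Sobel, for brevity).
    b = block.astype(float)
    gx = b[:, 1:] - b[:, :-1]
    gy = b[1:, :] - b[:-1, :]
    return float((gx ** 2).sum() + (gy ** 2).sum())

def select_roi_blocks(img_prev, img_curr, m, n, k=5):
    """Pick the k sub-blocks with the largest normalized IEF change
    Delta = |E' - E| / E between two consecutive focusing steps.

    Returns (row, col) block indices; m x n is the sub-block size.
    """
    M, N = img_prev.shape
    deltas = {}
    for bi in range(M // m):
        for bj in range(N // n):
            sl = (slice(bi * m, (bi + 1) * m),
                  slice(bj * n, (bj + 1) * n))
            e = _sharpness(img_prev[sl])
            e_new = _sharpness(img_curr[sl])
            if e > 0:                      # skip featureless blocks
                deltas[(bi, bj)] = abs(e_new - e) / e
    return sorted(deltas, key=deltas.get, reverse=True)[:k]
```

Only the selected ROI blocks are then fed to the IEF during the remaining focus iterations, which is what shortens the computation and sharpens the focusing curve for sparse scenes.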
To verify the effectiveness of the improved dynamic focusing window, for each of the images in Figure 3, we calculated the normalized values of the Tenengrad IEF for three focus window algorithms: whole-image focus window, traditional center window, and dynamic focus window. Figure 6a–h gives a comparison of the calculated results from the three methods.
Table 7 shows the average slope of the IEF values around the peak (positive focus position) under different window selection methods in the eight scenes. The larger the average slope, the more significant the change in the IEF value, which helps improve the accuracy of auto-focusing. From Figure 6 and Table 7, the curves of the whole-image focus window agree well with the traditional center window for all scenarios, while the ES-DAFW method produces sharper peaks than the other two methods. The improvement of the ES-DAFW is most significant when dealing with relatively sparse images such as Figure 3h: in Figure 6h, the ES-DAFW curves are much sharper than those calculated by the whole-image focus window and traditional center window methods, indicating a great improvement in the sensitivity of the IEF on both sides of the peak and, thereby, a faster auto-focus speed. We believe that in a telescope system imaging sparse scenes, ES-DAFW would achieve a more accurate and much faster auto-focus process.
In order to compare the efficiency of different window selection methods in the auto-focusing process of the telescope EOTS, we used the whole-image focus (WIF), traditional center focus window (TCFW), and ES-DAFW methods to run the auto-focus of the system ten times on each of the eight scenes in Figure 3, and we recorded the average time required to complete the auto-focus process, as shown in Table 8. It can be seen that for sparse scenes, the traditional methods take longer to complete the auto-focus process. ES-DAFW takes less time in all scenes, including sparse ones, because it uses only the blocks of the image whose IEF values change greatly. This significantly improves the auto-focus efficiency of the telescope EOTS.

5. Conclusions

Accurate and rapid auto-focus methods based on image quality assessment were studied in this paper, and the accuracy and sensitivity of various IEFs on different scenes were compared. Further, in order to improve the performance of IEFs, the ES-DAFW method was proposed based on the existing focus window selection method and was compared with the traditional method. Based on the above experiments and analysis, we summarize as follows:
(1) In many image scenes collected by the telescope, the Tenengrad function has high sensitivity (slope) near the peak value, which better reflects the defocusing process of telescope system imaging. It is thus suitable as the estimation function of a telescope system;
(2) From the experimental results of different focusing window methods, it can be seen that the ES-DAFW method provides higher sensitivity and more accurate results for the auto-focus process, especially for sparse images, when compared with the whole-image focusing window and the traditional center focusing window selection methods. At the same time, it has the advantage of simple calculation and can obviously shorten the time required for auto-focusing. These results promise significant applications to auto-focusing in other telescopes with EOTSs.

Author Contributions

C.Y. and M.C. were involved in leading the research project, funding acquisition, conceptualization, and writing—review and editing. W.L. designed and conducted the experiments and wrote up this work. Z.P. and F.Z. were involved in writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (61775030) and the Aeronautical Science Foundation of China (2016018001). It was also funded by the Sichuan Science and Technology Program (2019YJ0167 and 2019YFG0307).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Liu, Q.; Yu, W.; Mao, Y.; Zhang, J. The independent azimuth zero calibration method based on single gyro for electro-optical tracking system. In Proceedings of the International Symposium on Advanced Optical Manufacturing and Testing Technologies (AOMATT), Chengdu, China, 16 January 2019.
  2. Subbarao, M.; Choi, T.S.; Nikzad, A. Focusing Techniques. Opt. Eng. 1993, 32, 2824–2836.
  3. Park, J.K.; Jung, S. A design approach of the disturbance observer for the electro-optical tracking system. In Proceedings of the 2008 International Conference on Control, Automation and Systems, Seoul, Korea, 14–17 October 2008.
  4. Li, Y.; Tang, T.L.; Huang, W. A robust auto-focus measure based on inner energy. Optoelectron. Lett. 2017, 13, 309–313.
  5. Liang, H.; Lu, K.; Liu, X.; Xue, J. The Auto-focus Method for Scanning Acoustic Microscopy by Sparse Representation. Sens. Imaging 2019, 20, 33.
  6. Ouyang, P.; Yin, S.; Deng, C.; Liu, L.; Wei, S. A fast face detection architecture for auto-focus in smart-phones and digital cameras. Sci. China Inf. Sci. 2016, 59, 122402.
  7. Guan, H.; Niinami, N.; Liu, T. Real-time object tracking for moving target auto-focus in digital camera. Int. Soc. Opt. Photonics 2015, 9400, 940009.
  8. Kulkarni, P. Auto-focus algorithm based on statistical blur estimation. Int. Soc. Opt. Photonics 2013, 8667, 86671G.
  9. Hong, C.M.; Chen, C.M.; Kao, W.C.; Chuang, H.C.; Huang, S.H. A novel auto-focus approach utilizing discrete difference equation prediction model for digital camera. In Proceedings of the Eleventh International Fuzzy Systems Association World Congress (Volume II), Beijing, China, 28 July 2005.
  10. Di, Y.; Ye, H.; Qiu, X.; Li, T. Design of Microscopic Auto-Focusing Arithmetic with Nuclear Track Image. In Proceedings of the First International Symposium on Test Automation & Instrumentation, Beijing, China, 13 September 2006.
  11. Shi, H.; Shi, Y.; Li, X. Study on Auto-focus Methods of Optical Microscope. In Proceedings of the International Association of Computer Science and Information Technology, Hong Kong, China, 3 August 2012.
  12. Aslantas, V.; Toprak, A.N. Multi-focus image fusion based on optimal defocus estimation. Comput. Electr. Eng. 2017, 62, 302–318.
  13. Xiang, X.; Li, W.; Liu, Q.; Xu, J.; Ma, Y. Model and control for inner loop of electro-optical tracking servo system. In Proceedings of the 2014 IEEE International Conference on Mechatronics and Automation (ICMA), Tianjin, China, 28 August 2014.
  14. Zhou, R.; Ding, H.; Yu, F. A real-time continuous auto-focus algorithm for stereo microscope cameras. In Proceedings of the SPIE/COS Photonics Asia, Bellingham, WA, USA, 16 December 2014.
  15. Jiang, M.S.; Yang, T.; Xu, X.L.; Zhang, X.D.; Li, F. Rapid microscope auto-focus method for uneven surfaces based on image fusion. Microsc. Res. Tech. 2019, 82, 1621–1627.
  16. Zhang, T.; Wu, H.; Liu, Y.; Peng, L.; Yang, C.; Peng, Z. Infrared Small Target Detection Based on Non-Convex Optimization with Lp-Norm Constraint. Remote Sens. 2019, 11, 559.
  17. Li, H.; Huang, Y.; Wang, Q.; He, D.; Peng, Z.; Li, Q. Phase Offset Tracking for Free Space Digital Coherent Optical Communication System. Appl. Sci. 2019, 9, 836.
  18. Sakano, M.; Suetake, N.; Uchino, E. A Robust Point Spread Function Estimation for Out-of-Focus Blurred and Noisy Images Based on a Distribution of Gradient Vectors on the Polar Plane. Opt. Rev. 2007, 14, 297–303.
  19. Chang, C.L.; Huang, K.C.; Wu, W.H.; Hsiao, W.T. Development of telescope emulator system. In Proceedings of the Instrumentation and Measurement Technology Conference (I2MTC), Binjiang, China, 10–12 May 2011.
  20. Zhang, X.; Wu, H.; Ma, Y. A new auto-focus measure based on medium frequency discrete cosine transform filtering and discrete cosine transform. Appl. Comput. Harmonic Anal. 2016, 40, 430–437.
  21. Ashrafkhani, B.; Bahreini, M.; Tavassoli, S.H. Repeatability improvement of laser-induced breakdown spectroscopy using an auto-focus system. Opt. Spectrosc. 2015, 118, 841–846.
  22. Hao, P.; Li, K.; Wang, Z. Design of Cassegrain-Schmidt optical system. In Proceedings of the International Symposium on Advanced Optical Manufacturing and Testing Technologies (AOMATT), Chengdu, China, 14 November 2007.
  23. Iida, T.; Yamato, H.; Jin, T.; Nomura, Y. Optimal focus evaluated using Monte Carlo simulation in non-invasive neuroimaging in the second near-infrared window. MethodsX 2019, 6, 2367–2373.
  24. Hong, K.; Oh, K.J.; Choo, H.G.; Lim, Y.; Park, M. Viewing window position control on holographic projection system by electrically focused tunable lens. In Proceedings of the OPTO, San Francisco, CA, USA, 19 February 2018.
  25. Zhang, F.S.; Li, S.W.; Hu, Z.G.; Du, Z. Fish swarm window selection algorithm based on cell microscopic automatic focus. Clust. Comput. 2017, 20, 485–495.
  26. Tashlinskii, A.G.; Safina, G.L.; Voronov, S.V. Pseudogradient optimization of objective function in estimation of geometric interframe image deformations. Pattern Recognit. Image Anal. 2012, 22, 386–392.
  27. Lv, Z.; Jia, Y.; Zhang, Q. Joint image registration and point spread function estimation for the super-resolution of satellite images. Signal Process. Image Commun. 2017, 58, 199–211.
  28. Krotkov, E. Focusing. Int. J. Comput. Vis. 1988, 1, 223–237.
  29. He, J. Modified fast climbing search auto-focus algorithm with adaptive step size searching technique for digital camera. IEEE Trans. Consum. Electron. 2003, 49, 257–262.
  30. Gao, S.; Han, M.; Cheng, X. The fast iris image clarity evaluation based on Tenengrad and ROI selection. In Proceedings of the International Conference on Graphic and Image Processing, Qingdao, China, 10 April 2018.
  31. Bi, T.; Du, W. An improved Brenner definition evaluation function. Electron. Meas. Technol. 2019, 42, 80–84.
  32. Chen, Z.; Zhang, T. Realization of auto-focusing system for cameras based on TMS320F2812 DSP. In Proceedings of the 2011 International Conference on Electrical and Control Engineering (ICECE), Yichang, China, 16–18 September 2011.
  33. Bos, J.P.; Roggemann, M.C. Estimation of the atmospheric blurring function using blind image quality metrics. In Proceedings of the Optics & Photonics-Optical Engineering + Applications, San Diego, CA, USA, 24 October 2012.
  34. Dong, B.; Jin, R.; Weng, G. Active contour model based on local bias field estimation for image segmentation. Signal Process. Image Commun. 2019, 78, 187–199.
  35. Bigelow, T.A.; O'Brien, W.D., Jr. Scatterer size estimation using a generalized ultrasound attenuation-compensation function to correct for focusing. In Proceedings of the 2003 IEEE Symposium on Ultrasonics, Honolulu, HI, USA, 5–8 October 2003.
  36. Lee, J.S.; Jung, Y.Y.; Kim, B.S.; Ko, S.J. An advanced video camera system with robust AF, AE, and AWB control. IEEE Trans. Consum. Electron. 2001, 47, 694–699.
  37. Xia, Y.; Bao, Q.; Liu, Z. A New Disturbance Feedforward Control Method for Electro-Optical Tracking System Line-Of-Sight Stabilization on Moving Platform. Sensors 2018, 18, 4350.
Figure 1. (a) Auto-focus diagram of a Cassegrain telescope system; (b) Hardware experiment system.
Figure 2. Auto-focus search process. IEF, image estimation function.
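The mountain-climb search illustrated in Figure 2 can be sketched as a coarse scan that stops once the IEF score starts falling, followed by a fine re-scan around the detected peak. The sketch below is illustrative only: `evaluate` stands in for moving the focus motor and scoring the captured frame with an IEF, and the step sizes are assumptions, not the paper's exact parameters.

```python
def hill_climb_focus(evaluate, positions, coarse_step=8, fine_step=1):
    """Coarse-to-fine mountain-climb search for the best focus position.

    `evaluate(pos)` is a placeholder for "move the motor to pos, grab a
    frame, return its IEF score"; a unimodal score curve is assumed.
    """
    # Coarse pass: scan with a large step until the score starts to drop.
    best = positions[0]
    best_score = evaluate(best)
    for pos in positions[::coarse_step]:
        score = evaluate(pos)
        if score >= best_score:
            best, best_score = pos, score
        else:
            break  # past the peak, stop the coarse scan
    # Fine pass: re-scan a small neighbourhood around the coarse peak.
    lo = max(positions[0], best - coarse_step)
    hi = min(positions[-1], best + coarse_step)
    for pos in range(lo, hi + 1, fine_step):
        score = evaluate(pos)
        if score > best_score:
            best, best_score = pos, score
    return best
```

With a simulated score curve peaked at position 37, e.g. `hill_climb_focus(lambda p: -(p - 37) ** 2, list(range(100)))`, the coarse pass stops shortly after passing the peak and the fine pass recovers the exact position.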
Figure 3. Different experimental scenes of the electro-optical tracking system (EOTS) auto-focus: (a) Windows of distant buildings, (b) Air conditioner outdoor unit, (c) Billboard, (d) Pipes on the roof, (e) Tower crane at a construction site, (f) Roof decoration of tall buildings, (g) Leaves in the distance, (h) Top railing of a building.
Figure 4. Comparison of different IEFs, corresponding to the scenarios shown in Figure 3.
Figure 5. Dynamic focus window for the scene in Figure 3h.
Figure 6. Comparison of three auto-focusing window selection methods: WIF (whole-image focus), TCFW (traditional center focusing window), and ES-DAFW (even sampling and dynamic adaptive focusing window).
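The window selection idea behind Figure 6 can be illustrated with a minimal sketch: candidate windows are sampled on an even grid over the frame and the sharpest one is kept. This is *not* the ES-DAFW algorithm from the paper (which additionally adapts the window dynamically during focusing); the window size, grid density, and the variance-based sharpness score below are all illustrative assumptions.

```python
def _variance(block):
    """Grey-level variance of a candidate window (used here as its sharpness score)."""
    pixels = [p for row in block for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def pick_focus_window(img, win=4, grid=3):
    """Evenly sample a `grid x grid` set of `win x win` candidate windows
    and return the top-left corner of the sharpest one.

    Illustrative sketch only; the paper's ES-DAFW method also updates the
    window adaptively as focusing proceeds, which is not modelled here.
    """
    h, w = len(img), len(img[0])
    best, best_score = (0, 0), -1.0
    for gy in range(grid):
        for gx in range(grid):
            # Spread window origins evenly over the valid placement range.
            y = gy * (h - win) // max(grid - 1, 1)
            x = gx * (w - win) // max(grid - 1, 1)
            block = [row[x:x + win] for row in img[y:y + win]]
            score = _variance(block)
            if score > best_score:
                best, best_score = (y, x), score
    return best
```

On a mostly flat frame with a textured patch in one corner, the sampled window covering the texture wins, which is why such schemes cope better with sparse scenes than a fixed center window.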
Table 1. The change of the Tenengrad function during focusing in different scenes.
| Scene a | Scene b | Scene c | Scene d | Scene e | Scene f | Scene g | Scene h |
|---|---|---|---|---|---|---|---|
| 0.5373 | 0.8120 | 0.7846 | 0.8407 | 0.8125 | 0.7576 | 0.7773 | 0.9630 |
| 0.6268 | 0.8524 | 0.8345 | 0.9189 | 0.8889 | 0.8033 | 0.8430 | 0.9673 |
| 0.9906 | 0.9780 | 0.9328 | 0.9735 | 0.9748 | 0.9048 | 0.9726 | 0.9782 |
| 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 |
| 0.7947 | 0.9658 | 0.8437 | 0.7972 | 0.9530 | 0.8274 | 0.9637 | 0.9871 |
| 0.6390 | 0.9234 | 0.7534 | 0.7091 | 0.8995 | 0.7147 | 0.8439 | 0.9637 |
| 0.5564 | 0.8470 | 0.6928 | 0.6522 | 0.8022 | 0.6538 | 0.7638 | 0.9528 |
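Table 1 tracks how the Tenengrad score rises to its normalized peak at the focus position and falls away on either side. The Tenengrad IEF itself is the mean squared Sobel gradient magnitude of the focusing window; the sketch below is a plain-Python version operating on a 2-D list of grey levels (border pixels are skipped for simplicity, and no normalization is applied).

```python
def tenengrad(img):
    """Tenengrad IEF: mean squared Sobel gradient magnitude.

    `img` is a 2-D list of grey levels; sharper images yield larger values.
    """
    h, w = len(img), len(img[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical Sobel responses at (y, x).
            gx = (img[y - 1][x + 1] + 2 * img[y][x + 1] + img[y + 1][x + 1]
                  - img[y - 1][x - 1] - 2 * img[y][x - 1] - img[y + 1][x - 1])
            gy = (img[y + 1][x - 1] + 2 * img[y + 1][x] + img[y + 1][x + 1]
                  - img[y - 1][x - 1] - 2 * img[y - 1][x] - img[y - 1][x + 1])
            total += gx * gx + gy * gy
    return total / ((h - 2) * (w - 2))
```

A frame with a sharp vertical edge scores well above a defocused (flat) frame, which is what makes the function usable as a focus measure.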
Table 2. The change of the Brenner function during focusing in different scenes.
| Scene a | Scene b | Scene c | Scene d | Scene e | Scene f | Scene g | Scene h |
|---|---|---|---|---|---|---|---|
| 0.7430 | 0.9486 | 0.8742 | 0.8537 | 0.8926 | 0.9478 | 0.8835 | 0.9784 |
| 0.7636 | 0.9522 | 0.8851 | 0.9236 | 0.9234 | 0.9585 | 0.9127 | 0.9922 |
| 0.9685 | 0.9987 | 0.9295 | 0.9412 | 0.9537 | 0.9635 | 0.9648 | 0.9993 |
| 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 |
| 0.8469 | 0.9657 | 0.9091 | 0.9318 | 0.9618 | 0.9697 | 0.9730 | 0.9458 |
| 0.7962 | 0.9274 | 0.8611 | 0.8662 | 0.9062 | 0.9256 | 0.9123 | 0.9176 |
| 0.7406 | 0.8486 | 0.8428 | 0.7475 | 0.8275 | 0.8925 | 0.8403 | 0.8992 |
Table 3. The change of the variance function during focusing in different scenes.
| Scene a | Scene b | Scene c | Scene d | Scene e | Scene f | Scene g | Scene h |
|---|---|---|---|---|---|---|---|
| 0.8253 | 0.9182 | 0.8879 | 0.8534 | 0.8842 | 0.8842 | 0.8994 | 0.9681 |
| 0.8525 | 0.9342 | 0.9038 | 0.8842 | 0.9234 | 0.9055 | 0.9146 | 0.9842 |
| 0.9715 | 0.9832 | 0.9437 | 0.9714 | 0.9389 | 0.9593 | 0.9762 | 0.9973 |
| 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 |
| 0.9194 | 0.9741 | 0.9194 | 0.9342 | 0.9734 | 0.9294 | 0.9537 | 0.9748 |
| 0.8621 | 0.9325 | 0.8856 | 0.8792 | 0.863 | 0.8913 | 0.8774 | 0.9695 |
| 0.8423 | 0.8832 | 0.8723 | 0.8344 | 0.7841 | 0.8725 | 0.8122 | 0.9599 |
Table 4. The change of the Laplace function during focusing in different scenes.
| Scene a | Scene b | Scene c | Scene d | Scene e | Scene f | Scene g | Scene h |
|---|---|---|---|---|---|---|---|
| 0.6642 | 0.9658 | 0.8443 | 0.7425 | 0.8892 | 0.8442 | 0.7559 | 0.9536 |
| 0.7354 | 0.9904 | 0.8651 | 0.7935 | 0.9117 | 0.8639 | 0.8248 | 0.9810 |
| 0.8543 | 1.0000 | 0.9318 | 0.9051 | 0.9610 | 0.9317 | 0.9146 | 0.9983 |
| 1.0000 | 0.9835 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 |
| 0.8523 | 0.9734 | 0.9073 | 0.8754 | 0.9626 | 0.9252 | 0.9236 | 0.9881 |
| 0.7332 | 0.9256 | 0.8426 | 0.7640 | 0.9173 | 0.8631 | 0.8648 | 0.9689 |
| 0.6942 | 0.9052 | 0.7932 | 0.7223 | 0.8246 | 0.7980 | 0.7226 | 0.9677 |
Table 5. The change of the entropy function during focusing in different scenes.
| Scene a | Scene b | Scene c | Scene d | Scene e | Scene f | Scene g | Scene h |
|---|---|---|---|---|---|---|---|
| 0.9722 | 0.9810 | 0.9174 | 0.9485 | 0.9557 | 0.9347 | 0.9527 | 0.9510 |
| 0.9881 | 0.9583 | 0.9246 | 0.9477 | 0.9485 | 0.9638 | 0.9736 | 0.9659 |
| 0.9901 | 0.9831 | 0.9050 | 0.9531 | 0.9602 | 0.9750 | 0.9801 | 0.9483 |
| 1.0000 | 0.9677 | 1.0000 | 0.9335 | 0.9647 | 1.0000 | 1.0000 | 0.9631 |
| 0.9598 | 0.9432 | 0.9378 | 0.9546 | 1.0000 | 0.9345 | 0.9734 | 0.9577 |
| 0.9626 | 0.9259 | 0.8893 | 0.9539 | 0.9520 | 0.9130 | 0.9526 | 0.9731 |
| 0.9762 | 0.9376 | 0.8734 | 0.9665 | 0.9663 | 0.8773 | 0.9257 | 0.9677 |
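The other four IEFs compared in Tables 2–5 each fit in a few lines. The sketches below operate on a 2-D list of grey levels, omit normalization, and use common textbook forms of these functions (sum of squared two-pixel differences for Brenner, grey-level variance, mean squared 4-neighbour Laplacian response, and Shannon entropy of the grey-level histogram); they are illustrations, not the paper's exact implementations.

```python
import math

def brenner(img):
    """Brenner IEF: sum of squared differences between pixels two apart."""
    return sum((row[x + 2] - row[x]) ** 2
               for row in img for x in range(len(row) - 2))

def variance(img):
    """Variance IEF: grey-level variance over the focusing window."""
    pixels = [p for row in img for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def laplacian(img):
    """Laplace IEF: mean squared 4-neighbour Laplacian response."""
    h, w = len(img), len(img[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = (4 * img[y][x] - img[y - 1][x] - img[y + 1][x]
                 - img[y][x - 1] - img[y][x + 1])
            total += v * v
    return total / ((h - 2) * (w - 2))

def entropy(img):
    """Entropy IEF: Shannon entropy of the grey-level histogram."""
    pixels = [p for row in img for p in row]
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())
```

The gradient-based measures (Brenner, Laplacian) respond strongly to edges, while variance and entropy depend only on the grey-level distribution, which is consistent with entropy's flat curves in Table 5 and its low Ef values in Table 6.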
Table 6. Ef values of different IEFs in the eight scenarios.
| Scene | Tenengrad | Brenner | Variance | Laplace | Entropy |
|---|---|---|---|---|---|
| Scene a | 0.371 | 0.156 | 0.131 | 0.250 | 0.027 |
| Scene b | 0.212 | 0.204 | 0.183 | 0.136 | 0.050 |
| Scene c | 0.273 | 0.104 | 0.111 | 0.156 | 0.049 |
| Scene d | 0.326 | 0.211 | 0.161 | 0.244 | 0.035 |
| Scene e | 0.289 | 0.215 | 0.214 | 0.240 | 0.080 |
| Scene f | 0.333 | 0.112 | 0.139 | 0.241 | 0.085 |
| Scene g | 0.321 | 0.186 | 0.183 | 0.293 | 0.063 |
| Scene h | 0.083 | 0.092 | 0.058 | 0.061 | 0.032 |
Table 7. The average slopes of the three window selection methods around the focus position in different scenes.
| Scene | WIF | TCFW | ES-DAFW |
|---|---|---|---|
| Scene a | 0.183 | 0.13 | 0.316 |
| Scene b | 0.071 | 0.075 | 0.161 |
| Scene c | 0.103 | 0.054 | 0.129 |
| Scene d | 0.093 | 0.05 | 0.174 |
| Scene e | 0.053 | 0.055 | 0.146 |
| Scene f | 0.120 | 0.035 | 0.155 |
| Scene g | 0.071 | 0.075 | 0.101 |
| Scene h | 0.018 | 0.012 | 0.047 |
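A steeper slope of the IEF curve near its peak means a more sensitive focus measure, which is what Table 7 quantifies. One plausible way to compute such a slope (the paper's exact definition is given in the body text, so this sketch is an assumption) is the mean absolute per-step change of the normalized curve within a few steps of its maximum:

```python
def average_slope(scores, k=2):
    """Mean absolute per-step change of a normalised IEF curve within
    k steps of its peak. One plausible reading of Table 7's metric,
    not necessarily the paper's exact definition.
    """
    peak = scores.index(max(scores))
    lo, hi = max(0, peak - k), min(len(scores) - 1, peak + k)
    diffs = [abs(scores[i + 1] - scores[i]) for i in range(lo, hi)]
    return sum(diffs) / len(diffs)
```

For example, applying it to the scene-a Tenengrad column of Table 1 (`[0.5373, 0.6268, 0.9906, 1.0000, 0.7947, 0.6390, 0.5564]`) gives a slope of roughly 0.18.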
Table 8. The average times taken by different window selection methods to complete auto-focusing.
| Scene | WIF (s) | TCFW (s) | ES-DAFW (s) |
|---|---|---|---|
| Scene a | 8.36 | 6.37 | 3.27 |
| Scene b | 7.94 | 6.58 | 3.54 |
| Scene c | 8.12 | 6.14 | 4.79 |
| Scene d | 6.37 | 5.24 | 3.71 |
| Scene e | 9.28 | 7.64 | 4.39 |
| Scene f | 7.83 | 7.14 | 4.03 |
| Scene g | 9.28 | 6.39 | 5.17 |
| Scene h | 12.41 | 10.75 | 6.32 |