
Implementation and Optimization of a Dual-confocal Autofocusing System

1 Metal Industries Research and Development Centre, Kaohsiung City 81160, Taiwan
2 Department of Mechanical Engineering, National Cheng Kung University, Tainan City 70101, Taiwan
3 Department of Mechanical Engineering, National Chung Cheng University, Chiayi County 62102, Taiwan
* Author to whom correspondence should be addressed.
Sensors 2020, 20(12), 3479; https://doi.org/10.3390/s20123479
Submission received: 18 May 2020 / Revised: 11 June 2020 / Accepted: 17 June 2020 / Published: 19 June 2020
(This article belongs to the Section Optical Sensors)

Abstract

This paper describes the implementation and optimization of a dual-confocal autofocusing system that can directly determine a real-time position by measuring the response signal (i.e., intensity) at the front and rear focal points of the system. This is a new and systematic design strategy that makes it possible to use this system for other applications by retrieving their characteristic curves experimentally; there is even a good chance of this technique becoming the gold standard for optimizing dual-confocal configurations. We adopted two indexes to predict system performance and found that the rear focal position and its physical design are the major factors. A laboratory-built prototype was constructed and demonstrated to validate the optimization. The experimental results showed that a total optical difference from 150 to 400 mm significantly affected the effective volume of our designed autofocusing system. The results also showed that the sensitivity of the dual-confocal autofocusing system is affected more by the position of the rear focal point than by the position of the front focal point. The final optimized setup set the rear focal length and the front focal length at 100 and 200 mm, respectively. In addition, the characteristic curve between the focus error signal and its position successfully defines the exact position through a polynomial equation of the sixth order, meaning that the system can be straightforwardly applied to an accurate micro-optical autofocusing system.

1. Introduction

Due to their good reliability, high throughput, and relatively low cost, machine vision systems are an attractive solution for the inspection process in automated mass-production lines. In practice, such systems always need a highly precise autofocusing capability to obtain sufficiently sharp images of the object of interest [1,2,3,4]. Autofocusing systems have been widely applied in recent decades in a variety of industrial manufacturing and measurement fields, such as cellphone camera modules, automated optical inspection, and dynamic tracking systems [4]. Many autofocusing systems have been developed, and they can be broadly classified into image-based methods [5,6,7,8,9,10,11,12,13,14,15,16] and optics-based methods [17,18,19,20,21,22,23,24,25,26,27,28,29,30]. Both need to be driven by motors to achieve autofocusing, which limits the possibility of their direct implementation for in-line inspection.
In an image-based autofocusing system, the position of the focusing objective lens is determined by capturing real images through a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor with the use of complex image processing. The performance of the image-based autofocusing system depends on the sharpness of the captured images, or the image spatial frequency function, which is used to calculate the focus value (FV). In hill-climbing search, locating the peak FV determines the required focus position; this simple configuration is easy to handle and stable, but rather time-consuming and with a very limited effective depth of focus. Many research articles [31,32,33,34,35,36,37] have investigated improving the algorithm in software and tried to make the technique applicable to the product line.
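The hill-climbing search described above can be sketched as follows; the gradient-based focus value and the `capture(z)` callback are illustrative assumptions standing in for a real camera and motorized stage, not any cited implementation.

```python
import numpy as np

def focus_value(image: np.ndarray) -> float:
    """Tenengrad-style focus value: mean squared gradient magnitude.
    Sharper (in-focus) images contain more high-frequency energy."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def hill_climb_focus(capture, z_start: float, step: float,
                     z_min: float, z_max: float):
    """Climb the FV curve: move while the focus value increases, stop at
    the peak. `capture(z)` is assumed to return an image at position z."""
    z, fv = z_start, focus_value(capture(z_start))
    direction = 1.0
    # Probe one step to choose the climbing direction.
    if focus_value(capture(z + step)) < fv:
        direction = -1.0
    while z_min <= z + direction * step <= z_max:
        z_next = z + direction * step
        fv_next = focus_value(capture(z_next))
        if fv_next <= fv:          # passed the peak of the FV curve
            break
        z, fv = z_next, fv_next
    return z, fv
```

As the text notes, this loop is easy to implement and stable, but it needs one image capture per step, which is exactly what makes the approach time-consuming.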
In the optics-based autofocusing method, the triangle geometric and dual-confocal configurations are widely used in the system. The triangle geometric technologies are well-known optical methods which are applied in many inspection instruments [38,39,40]. Our research group has published many methods to improve the focusing accuracy and response of the optics-based autofocusing systems with triangle geometric technologies [27,28,29,30].
Accordingly, we claim to have developed a novel and easily constructed autofocusing system by adopting a dual-confocal configuration. By utilizing a differential mode, we deal with the spatial optical intensity distribution from the rear and the front focal planes, retrieving two focus error signals. Furthermore, our system can directly retrieve the location of the specific focal plane. Meanwhile, we focus on an implementation in this paper in combination with an optimization method based on the optical autofocusing system. The commercially available software (ZEMAX) was used to verify our experimental setup. The system has the significant benefit of high accuracy, a fast response time, and the retrieval of the exact moving direction while tracking the front and rear focal planes, and there is tremendous potential for applying our system in mini-optomechanics with the required rapid autofocusing technique. Overall, we not only demonstrate the laboratory-built prototype in our study, but also show numerical analyses employed to determine the optimal design parameters of our proposed dual-confocal autofocusing system.

2. Methods

The image-based autofocusing system, which captures real images through CCD or CMOS devices in cooperation with complex image processing, is a classic and well-known technique. Calculating the sharpness of the captured images, or the FV derived from the image spatial frequency function, indicates the position of the focusing objective lens when in focus. Chen et al. [7] discussed an image-based autofocusing system in which the peak value, found by hill-climbing search over a simple configuration, determines focusing; the approach is easy to handle and stable, but rather time-consuming and with a very limited effective depth of focus. They adopted the discrete wavelet transform (DWT) method to perform the sharpness measurement and for further validation.
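As a rough illustration of a DWT-based sharpness measure (a simplified stand-in, not the exact metric of [7]), a single level of the 2-D Haar transform can score an image by the fraction of its energy held in the detail subbands:

```python
import numpy as np

def haar_dwt_sharpness(image: np.ndarray) -> float:
    """One-level 2-D Haar DWT sharpness: ratio of detail-subband energy
    (LH + HL + HH) to total energy. In-focus images score higher."""
    img = image.astype(float)
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]                       # crop to even dimensions
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0              # approximation subband
    lh = (a + b - c - d) / 2.0              # horizontal detail
    hl = (a - b + c - d) / 2.0              # vertical detail
    hh = (a - b - c + d) / 2.0              # diagonal detail
    detail = np.sum(lh**2) + np.sum(hl**2) + np.sum(hh**2)
    total = detail + np.sum(ll**2)
    return float(detail / total) if total > 0 else 0.0
```

Because defocus blur suppresses high frequencies, this score falls as the image blurs, so it can serve as the FV in a hill-climbing search.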
Jeon et al. [31] presented the fully digital autofocusing (FDAF) method, which obtains a fast and precise autofocusing module by automatically searching the focusing area in cooperation with the point spread function (PSF). In 2011, Koh et al. proposed a configuration adopting two low-pass filters and double apertures that, by capturing two monochromatic images, can denoise and distinguish the defocus direction and position through the variance of gradient magnitude (VGM) [32].
In 1993, Yamada et al. [33] patented a configuration that resolved the problem of distinguishing the rear focusing beam from the front one under a high-power zoom lens by creating an optical difference with a switch device and a prism. Capturing only these two images yielded sufficient information, with the exact focal plane located between the front focusing beam and the rear focusing beam. This was arguably the first concept of the dual-confocal autofocusing system, and it easily handled determining the focusing position and its moving direction.
The optics-based configuration uses photodetectors (e.g., CCD, CMOS, or a photomultiplier tube (PMT)) to measure the shape and intensity of a laser spot and thereby estimate focus; this is highly precise and fast because it relies on a position error signal (or focus error signal). In 2010, Wang et al. developed a femtosecond laser machining system by adopting an autofocusing module combined with the dual-confocal system, as shown in Figure 1. Their simulations and experiments demonstrated positioning accuracy and repeatability both below ±1.5 μm within a measuring range of ±200 μm, based on the respective response intensities of the dual near-focusing positions [41]. This is also an example of the above-mentioned dual-confocal configuration being applied for an autofocusing function.
Using the dual-confocal configuration, we propose an optics-based autofocusing system in this paper, as shown in Figure 2a. The response intensity at the focusing or defocusing position can be obtained from two photodetector (PD) signals at the front and rear focal planes, which are used to calibrate and determine the exact focusing position and its direction. Moreover, we provide a characteristic curve of the focus error signal obtained from the difference of the two PD signals. This optics-based autofocusing system simply uses well-placed PDs to retrieve the real-time intensity signal near the focal plane, which directly indicates the focusing position through a merit function of the intensity distribution. Our proposed system has the advantages of high precision and a rapid response time, enabling the module to be easily imported to the production line. Compared with a conventional confocal system, it obtains the moving-direction information merely by retrieving the intensity near the focal plane, i.e., at the exact positions indicated. According to the light intensity distribution of Gaussian focusing, the two original signals (i.e., intensities) detected by the PDs at the different focal positions form two characteristic curves. The location of the focal plane is indicated by comparing the intersection of the two spatial optical intensity distribution curves with the rear and front defocus distances, respectively, as shown in Figure 2b. Thus, we obtain the position of the focal plane together with its moving direction, needing only the exact optical difference between the two light beams for front focusing and rear focusing.
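A minimal numerical sketch of this differential read-out is given below; the Lorentzian-shaped axial responses, the ±100 μm peak positions, and the 50 μm half-width are illustrative assumptions rather than the measured curves of Figure 2b.

```python
# Hypothetical axial response of a pinhole photodetector: a Lorentzian-like
# confocal response peaked at that detector's conjugate focal position (μm).
def pd_intensity(z: float, z_peak: float, half_width: float = 50.0) -> float:
    return 1.0 / (1.0 + ((z - z_peak) / half_width) ** 2)

def focus_error_signal(z: float, z_front: float = -100.0,
                       z_rear: float = +100.0) -> float:
    """Normalized differential of the two PD signals.
    FES = 0 at the true focal plane (midway between the two peaks);
    its sign tells which way the sample must move."""
    i_front = pd_intensity(z, z_front)
    i_rear = pd_intensity(z, z_rear)
    return (i_rear - i_front) / (i_rear + i_front)
```

A single FES reading therefore carries both pieces of information the text emphasizes: the defocus magnitude (via the curve) and the moving direction (via the sign), with no mechanical trial-and-error scan.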
In brief, we propose a novel dual-confocal configuration for an optics-based autofocusing microscope, to be used instead of a conventional confocal system or a centroid knife-edge method. This configuration boasts a simple scheme without redundant moving by trial and error; it has several significant characteristics enabling fast scanning, precise position tracking, and a low cost. Key points related to the above-mentioned methods are shown in Table 1.

3. System Implementation

Figure 3 illustrates the configuration of our proposed system, including a laser diode, a collimator, the first beam splitter (BS), a microscope module, the second BS, and two pinholes (set at the front focal plane and at the rear focal plane) that cooperate with the photodetectors (PDs). The light source was a laser diode (wavelength of 633 nm) made by Thorlabs (HL6501MG). The microscope module had a 10× objective (Olympus Co., f = 18 mm) and can cover a scanning area of about 2 mm × 2 mm, which is sufficient for an optics-based autofocusing system with a single-point scan application. Table 2 lists all key modules adopted in our proposed dual-confocal microscopic system. For comparison with the simulation results (considering a total optical difference from 150 to 400 mm relative to the objective of the microscope module), we constructed a prototype of the dual-confocal configuration with a rear focal length of 100 mm and a front focal length of 200 mm. Furthermore, the characteristic curve between the focus error signal (FES) and its position can successfully define the exact focusing position by a sixth-order polynomial equation.
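The sixth-order calibration step can be sketched as follows, with synthetic (FES, position) pairs standing in for a real calibration scan of the prototype:

```python
import numpy as np

# Synthetic calibration data: an assumed S-shaped position-vs-FES relation
# sampled over the normalized FES range (stand-ins for measured pairs).
fes = np.linspace(-1.0, 1.0, 41)                   # normalized FES samples
z = 200.0 * fes + 50.0 * fes**3 - 10.0 * fes**5    # assumed positions (μm)

# Fit position as a sixth-order polynomial in the measured FES.
coeffs = np.polyfit(fes, z, deg=6)
to_position = np.poly1d(coeffs)

# A newly measured FES value now maps straight to a defocus position (μm).
z_est = to_position(0.25)
```

Once the coefficients are stored, each measured FES sample maps to a position estimate with one polynomial evaluation, which is what makes the real-time read-out straightforward.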

4. Results

Our proposed configuration was characterized numerically using the commercially available software ZEMAX and then verified experimentally using a laboratory-built prototype, as shown in Figure 4. Figure 5 shows the simulated data obtained by the two PDs, i.e., the optical intensity retrieved at the front and rear focal-plane sites of our proposed system. Both sites were chosen so that each yields a similarly symmetrical curve of optical intensity distribution versus defocus distance. The optimization involved four independent parameters, with primary importance placed on the rear focal point. The maximum error of our autofocusing system could be evaluated on the basis of several tests (see Table 3). The FES, determined by the differential mode of the two signals, is directly dependent on the focusing position, as shown in Figure 6. We constructed an approximation formula referred to the results obtained from the FES and focusing position. More specifically, the raw data obtained from the PDs can easily be used to calculate and indicate the exact position in real time.
We demonstrated that the FES calculation can indicate the position by measuring the intensity with the two PDs, owing to the theoretically predictable intensity distribution. Considering the signal variability near the focal position and far away from the focal plane, the minimum resolvable position error would be 1.6 μm under the available signal limitation (as shown in Figure 7). Furthermore, the data retrieval performance at the rear focal point is better than at the front one, so we discuss how the relationships between the signal variability (μm) and several parameters (e.g., pinhole position, pinhole size, and effective focal length (EFL)) can be determined from the characteristic curves of the front and rear focal points (as shown in Figure 8). Here, we define the slope and the error position ΔX as the two indexes of the characteristic curve, as shown in Figure 9; they represent the signal transfer error and the available linear sensitivity. A low ΔX index under the minimum electric signal error indicates the potential of our focusing system for highly precise positioning. The slope index values (Slope 1 and Slope 2) are determined from the characteristic curve near the focal plane and away from the focal position, respectively, and indicate how linearly the FES maps to position.
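Given a sampled characteristic curve, the two indexes can be computed as sketched below; the function name, the fitting window, and the assumed minimum electric-signal error are illustrative choices, not values from the paper.

```python
import numpy as np

def curve_indexes(x_mm, fes, x0=0.0, window=0.5, signal_error=0.01):
    """Slope index: local linear sensitivity of the characteristic curve
    (FES change per mm) from a linear fit around x0.
    ΔX index: the positioning error implied by the minimum resolvable
    electric-signal error, i.e. signal_error divided by the slope."""
    near = np.abs(np.asarray(x_mm) - x0) <= window
    slope = np.polyfit(np.asarray(x_mm)[near], np.asarray(fes)[near], 1)[0]
    delta_x = signal_error / abs(slope)
    return slope, delta_x
```

A steeper slope directly shrinks ΔX, which is why the steeper rear-focal-point curve gives the better positioning performance reported above.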
Table 3 shows our simulation data, comprising thirty-one sets of the above-mentioned indexes, and the optimal parameters of the rear focal point chosen for our system. The optimal pinhole size is 75 μm, the optimal pinhole distance from the objective is 113.5 mm, and the optimal rear focal length is about 100 mm (as shown in Figure 10). Accordingly, the results in Figure 10 demonstrate that the measuring sensitivity of the rear focal point is better than that of the front focal point. We determined the optimal setup of the rear focal point first in order to retrieve the ΔX1 and ΔX2 indexes. We then adopted the following optimal parameters for the front focal point: a pinhole size of 400 μm, a pinhole distance from the objective of 35.6 mm, and a front focal length of about 200 mm. According to Figure 6, the front focal point yields a mirror-like curve, inverted in both the x-axis and the y-axis, which is similar to the curve of the rear focal point; this allows the FES to be calculated and its position to be indicated directly.
Our proposed optical configuration was suitably designed and constructed to retrieve an FES from the rear focal plane and another FES from the front focal plane. To indicate the location of the specific focal plane, the relationship between the FES and the defocus distance is adopted and its spatial optical intensity distribution is calculated. In this manuscript, we have proposed and developed a novel and easily constructed autofocusing system.

5. Conclusions

This paper describes a new design rule for choosing the optimal system parameters to obtain the lowest signal variability. For the front and rear focal points, we defined the values of the focal length, pinhole position, and pinhole size in a step-by-step process. Our results also showed that a total optical difference from 150 to 400 mm significantly affected the effective volume of our designed autofocusing system; this finding should be considered carefully when integrating this module into the whole system. We adopted two indexes to predict system performance and discovered that the rear focal position and its physical design are the major factors that directly affect the accuracy of an autofocusing system based on a dual-confocal configuration with sufficient dynamic range.

Author Contributions

Conceptualization, C.-M.J., C.-S.L., and J.-Y.Y.; methodology, C.-S.L. and J.-Y.Y.; validation, J.-Y.Y.; writing—original draft preparation, C.-M.J.; writing—review and editing, C.-S.L.; supervision, project administration, and funding acquisition, C.-S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology of Taiwan, grant numbers MOST 105-2221-E-006-265-MY5, 106-2628-E-006-010-MY3, and 108-2218-E-002-071.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Liu, C.S.; Lin, Y.C.; Hu, P.H. Design and characterization of precise laser-based autofocusing microscope with reduced geometrical fluctuations. Microsyst. Technol. 2013, 19, 1717–1724.
2. Wu, L.; Wen, G.; Wang, Y.; Huang, L.; Zhou, J. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing. Sensors 2018, 18, 595.
3. Kang, J.H.; Lee, C.B.; Joo, J.Y.; Lee, S.K. Phase-locked loop based on machine surface topography measurement using lensed fibers. Appl. Opt. 2011, 50, 460–467.
4. Liu, C.S.; Jiang, S.H. Design and experimental validation of novel enhanced-performance autofocusing microscope. Appl. Phys. B 2014, 117, 1161–1171.
5. Lee, S.; Lee, J.Y.; Yang, W.; Kim, D.Y. Autofocusing and edge detection schemes in cell volume measurements with quantitative phase microscopy. Opt. Express 2009, 17, 6476–6486.
6. Chang, H.C.; Shih, T.M.; Chen, N.Z.; Pu, N.W. A microscope system based on bevel-axial method auto-focus. Opt. Lasers Eng. 2009, 47, 547–551.
7. Chen, C.Y.; Hwang, R.C.; Chen, Y.J. A passive auto-focus camera control system. Appl. Soft Comput. 2010, 10, 296–303.
8. Bueno-Ibarra, M.A.; Alvarez-Borrego, J.; Acho, L.; Chavez-Sanchez, M.C. Fast autofocus algorithm for automated microscopes. Opt. Eng. 2005, 44, 063601-1–063601-8.
9. Lee, J.H.; Kim, Y.S.; Kim, S.R.; Lee, I.H.; Pahk, H.J. Real-time application of critical dimension measurement of TFT-LCD pattern using a newly proposed 2D image-processing algorithm. Opt. Lasers Eng. 2008, 46, 558–569.
10. Brazdilova, S.L.; Kozubek, M. Information content analysis in automated microscopy imaging using an adaptive autofocus algorithm for multimodal functions. J. Microsc. 2009, 236, 194–202.
11. Yazdanfar, S.; Kenny, K.B.; Tasimi, K.; Corwin, A.D.; Dixon, E.L.; Filkins, R.J. Simple and robust image-based autofocusing for digital microscopy. Opt. Express 2008, 16, 8670–8677.
12. Wright, E.F.; Wells, D.M.; French, A.P.; Howells, C.; Everitt, N.M. A low-cost automated focusing system for time-lapse microscopy. Meas. Sci. Technol. 2009, 20, 027003-1–027003-4.
13. Kim, T.; Poon, T.C. Autofocusing in optical scanning holography. Appl. Opt. 2009, 48, H153–H159.
14. Moscaritolo, M.; Jampel, H.; Knezevich, F.; Zeimer, R. An image based auto-focusing algorithm for digital fundus photography. IEEE Trans. Med. Imaging 2009, 28, 1703–1707.
15. Shao, Y.; Qu, J.; Li, H.; Wang, Y.; Qi, J.; Xu, G.; Niu, H. High-speed spectrally resolved multifocal multiphoton microscopy. Appl. Phys. B 2010, 99, 633–637.
16. Abdullah, S.J.; Ratnam, M.M.; Samad, Z. Error-based autofocus system using image feedback in a liquid-filled diaphragm lens. Opt. Eng. 2009, 48, 123602-1–123602-9.
17. Jung, B.J.; Kong, H.J.; Jeon, B.G.; Yang, D.Y.; Son, Y.; Lee, K.S. Autofocusing method using fluorescence detection for precise two-photon nanofabrication. Opt. Express 2011, 19, 22659–22668.
18. Zhang, P.; Prakash, J.; Zhang, Z.; Mills, M.S.; Efremidis, N.K.; Christodoulides, D.N.; Chen, Z. Trapping and guiding microparticles with morphing autofocusing Airy beams. Opt. Lett. 2011, 36, 2883–2885.
19. Wang, S.H.; Tay, C.J.; Quan, C.; Shang, H.M.; Zhou, Z.F. Laser integrated measurement of surface roughness and micro-displacement. Meas. Sci. Technol. 2000, 11, 454–458.
20. Fan, K.C.; Chu, C.L.; Mou, J.I. Development of a low-cost autofocusing probe for profile measurement. Meas. Sci. Technol. 2001, 12, 2137–2146.
21. Tanaka, Y.; Watanabe, T.; Hamamoto, K.; Kinoshita, H. Development of nanometer resolution focus detector in vacuum for extreme ultraviolet microscope. Jpn. J. Appl. Phys. 2006, 45, 7163–7166.
22. Li, Z.; Wu, K. Autofocus system for space cameras. Opt. Eng. 2005, 44, 053001-1–053001-5.
23. Rhee, H.G.; Kim, D.I.; Lee, Y.W. Realization and performance evaluation of high speed autofocusing for direct laser lithography. Rev. Sci. Instrum. 2009, 80, 073103-1–073103-5.
24. He, M.; Zhang, W.; Zhang, X. A displacement sensor of dual-light based on FPGA. Optoelectron. Lett. 2007, 3, 294–298.
25. Kim, K.H.; Lee, S.Y.; Kim, S.; Jeong, S.G. DNA microarray scanner with a DVD pick-up head. Curr. Appl. Phys. 2008, 8, 687–691.
26. Liu, C.S.; Jiang, S.H. A novel laser displacement sensor with improved robustness toward geometrical fluctuations of the laser beam. Meas. Sci. Technol. 2013, 24, 105101-1–105101-8.
27. Liu, C.S.; Hu, P.H.; Lin, Y.C. Design and experimental validation of novel optics-based autofocusing microscope. Appl. Phys. B 2012, 109, 259–268.
28. Liu, C.S.; Wang, Z.Y.; Chang, Y.C. Design and characterization of high-performance autofocusing microscope with zoom in/out functions. Appl. Phys. B 2015, 121, 69–80.
29. Liu, C.S.; Jiang, S.H. Precise autofocusing microscope with rapid response. Opt. Lasers Eng. 2015, 66, 294–300.
30. Liu, C.S.; Song, R.C.; Fu, S.J. Design of a laser-based autofocusing microscope for a sample with a transparent boundary layer. Appl. Phys. B 2019, 125, 199.
31. Jeon, J.; Yoon, I.; Kim, D.; Lee, J.; Paik, J. Fully digital auto-focusing system with automatic focusing region selection and point spread function estimation. IEEE Trans. Consum. Electron. 2010, 56, 1204–1210.
32. Koh, K.; Kuk, J.G.; Jin, B.; Choiand, W.; Cho, N.I. Autofocus method using dual aperture and color filters. J. Electron. Imaging 2011, 20, 033002-1–033002-6.
33. Yamana, M. Automatic Focal-Point Sensing Apparatus Sensing High and Low Magnification. U.S. Patent 5245173, 30 November 1993.
34. Xu, X.; Wang, Y.; Tang, J.; Zhang, X.; Liu, X. Robust Automatic Focus Algorithm for Low Contrast Images Using a New Contrast Measure. Sensors 2011, 11, 8281–8294.
35. Aoyama, T.; Takeno, S.; Takeuchi, M.; Hasegawa, Y. Head-Mounted Display-Based Microscopic Imaging System with Customizable Field Size and Viewpoint. Sensors 2020, 20, 1967.
36. Yang, C.; Chen, M.; Zhou, F.; Li, W.; Peng, Z. Accurate and Rapid Auto-Focus Methods Based on Image Quality Assessment for Telescope Observation. Appl. Sci. 2020, 10, 658.
37. Werner, T.; Carrasco, J. Validating Autofocus Algorithms with Automated Tests. Robotics 2018, 7, 33.
38. Zheng, F.; Zhang, B.; Gao, R.; Feng, Q. A High-Precision Method for Dynamically Measuring Train Wheel Diameter Using Three Laser Displacement Transducers. Sensors 2019, 19, 4148.
39. Zheng, F.; Feng, Q.; Zhang, B.; Li, J. A Method for Simultaneously Measuring 6DOF Geometric Motion Errors of Linear and Rotary Axes Using Lasers. Sensors 2019, 19, 1764.
40. Chen, Y.-T.; Huang, Y.-S.; Liu, C.-S. An Optical Sensor for Measuring the Position and Slanting Direction of Flat Surfaces. Sensors 2016, 16, 1061.
41. Wang, Y.H.; Hu, P.H.; Lin, Y.C.; Ke, S.S.; Chang, Y.H.; Liu, C.S.; Hong, J.B. Dual-confocal auto-focus sensing system in ultrafast laser application. In Proceedings of the 2010 IEEE Sensors, Kona, HI, USA, 1–4 November 2010; pp. 486–489.
Figure 1. The schematic diagram of the dual-confocal system [41].
Figure 2. Proposed dual-confocal autofocusing system: (a) schematic illustration and (b) simulation of the front and the rear focal positions (intensity distribution).
Figure 3. Schematic illustration of our proposed dual-confocal system.
Figure 4. Photograph of the laboratory-built prototype.
Figure 5. Simulation of the intensity signals from the two photodetectors.
Figure 6. The relationship between the focus error signal (FES) and the defocus distance.
Figure 7. The distribution of signal variability versus defocus distance.
Figure 8. Simulation results of the variation in sensitivity of the autofocusing system depending on (a) the position of rear focal point and (b) the position of front focal point.
Figure 9. The key index of the characteristic curve in our focusing system.
Figure 10. Our simulation of the optimal parameters of the rear focal point.
Table 1. Comparison of prior autofocusing system studies.

Development Team | Key Points | Compared with Our Proposed System | Reference
Chen et al. | Image-based; utilizes a SOM neural network to calculate the individual FV of the image spatial frequency function; utilizes the DWT method for sharpness measuring | Uses a complex algorithm | [7]
Jeon et al. | Fully digital autofocusing; fast and precise autofocusing; cooperation with the point spread function | Has a high-cost image capturing system | [31]
Koh et al. | Adopts two low-pass filters and double apertures; position variance of gradient magnitude (VGM) | Requires a complex algorithm to deal with blurred images | [32]
Yamada et al. | High-power zoom lens; optical difference created by utilizing a switch prism | Requires high-cost and precise positioning for image capturing | [33]
Wang et al. | Response intensity of dual near-focusing position | Does not retrieve moving-direction information near the focal plane | [41]
Table 2. Devices used in our proposed system.

Key Part | Device
Laser light source | Laser diode (Thorlabs HL6501MG)
Collimator | Thorlabs LT110P-A (f = 6.24 mm); Thorlabs LT240P-A (f = 8 mm)
Beam splitter (BS) | Thorlabs CM1-BS013 (50:50)
Polarized beam splitter (PBS) | Thorlabs CM1-PBS251
Objective | Olympus Co. (f = 18 mm)
Focusing lens | Thorlabs AC254-100A; Thorlabs AC254-200A
Pinholes | Thorlabs P75S; Thorlabs P300S; Thorlabs P400S
PD | Thorlabs PDA100A
Co-axial vision | Navitar 1-6030, 1-60255
Motor | Newport ILS-250HA
Table 3. Our simulation of the optimal parameters for the rear focal point.

SET | EFL (mm) | Pinhole Position D (mm) | Pinhole Size ψ (μm) | ΔX1 (mm) | Slope 1 | ΔX2 (mm) | Slope 2
1 | 100 | 113.5 | 60 | 1.436 | 4.178 | 9.618 | 0.624
2 | 100 | 113.5 | 70 | 1.465 | 4.095 | 9.361 | 0.641
3 | 100 | 113.5 | 75 | 1.493 | 4.019 | 10.342 | 0.580
4 | 100 | 113.5 | 80 | 1.518 | 3.952 | 11.314 | 0.530
5 | 100 | 113.5 | 90 | 1.581 | 3.795 | 13.583 | 0.442
7 | 100 | 113.5 | 75 | 1.465 | 4.095 | 9.361 | 0.641
8 | 100 | 114 | 75 | 1.527 | 3.929 | 8.903 | 0.674
10 | 200 | 273.5 | 140 | 1.431 | 4.192 | 7.323 | 0.819
11 | 200 | 273.5 | 150 | 1.453 | 4.129 | 8.181 | 0.733
12 | 200 | 273.5 | 160 | 1.480 | 4.055 | 9.490 | 0.632
13 | 200 | 273.5 | 170 | 1.497 | 4.007 | 9.356 | 0.641
14 | 200 | 273.5 | 180 | 1.517 | 3.956 | 10.061 | 0.596
15 | 200 | 273.5 | 190 | 1.539 | 3.900 | 10.977 | 0.547
16 | 200 | 273.5 | 200 | 1.557 | 3.855 | 10.740 | 0.559
17 | 200 | 273.5 | 210 | 1.578 | 3.803 | 11.116 | 0.540
18 | 200 | 273.5 | 220 | 1.609 | 3.728 | 13.002 | 0.461
20 | 200 | 270.5 | 160 | 1.429 | 4.197 | 11.135 | 0.539
21 | 200 | 271 | 160 | 1.430 | 4.195 | 9.959 | 0.602
22 | 200 | 271.5 | 160 | 1.440 | 4.168 | 9.770 | 0.614
23 | 200 | 272 | 160 | 1.442 | 4.161 | 8.800 | 0.682
24 | 200 | 272.5 | 160 | 1.451 | 4.134 | 8.665 | 0.692
25 | 200 | 273 | 160 | 1.465 | 4.096 | 8.896 | 0.674
26 | 200 | 273.5 | 160 | 1.480 | 4.055 | 9.490 | 0.632
27 | 200 | 274 | 160 | 1.487 | 4.035 | 9.142 | 0.656
28 | 200 | 274.5 | 160 | 1.493 | 4.019 | 8.307 | 0.722
29 | 200 | 275 | 160 | 1.504 | 3.988 | 8.401 | 0.714
30 | 200 | 275.5 | 160 | 1.512 | 3.969 | 8.130 | 0.738
31 | 200 | 276 | 160 | 1.517 | 3.955 | 7.904 | 0.759

Jan, C.-M.; Liu, C.-S.; Yang, J.-Y. Implementation and Optimization of a Dual-confocal Autofocusing System. Sensors 2020, 20, 3479. https://doi.org/10.3390/s20123479