Article

Design of a Monocentric Multiscale Optical System for Near-Diffraction-Limited Imaging with High Resolution and Large Field of View

1 School of Physics and Electronic Information, Yan’an University, Yan’an 716000, China
2 Key Laboratory of Advanced Optoelectronic Materials and Devices of Higher Education Institutions in Shaanxi, Yan’an University, Yan’an 716000, China
* Author to whom correspondence should be addressed.
Optics 2026, 7(1), 4; https://doi.org/10.3390/opt7010004
Submission received: 3 December 2025 / Revised: 23 December 2025 / Accepted: 31 December 2025 / Published: 4 January 2026
(This article belongs to the Section Engineering Optics)

Abstract

Multiscale optical imaging is expected to resolve the trade-off between field of view (FOV) and resolution in optical systems. To achieve high-resolution imaging over a large FOV, this study employs a double-layer monocentric lens as the front-stage objective and multiple relay lenses as the secondary system. The design results demonstrate that the RMS spot size across the full FOV is kept within 2 μm and that the system’s modulation transfer function (MTF) across the full FOV approaches the diffraction limit; specifically, the MTF values across the full FOV exceed 0.35 at the cutoff frequency of 250 lp/mm. The designed optical system features a simple structure and high imaging quality. When a larger number of secondary relay imaging systems is employed, the design can deliver the large-FOV, high-resolution imaging performance required of the optical system. It therefore holds significant application potential in wide-area, large-range imaging and related fields.

1. Introduction

In the rapidly advancing technological era, optical imaging plays an indispensable role in various fields. The scope of optical imaging applications spans a wide range of fields, including medical diagnosis, geological exploration, aerospace, and military reconnaissance [1,2,3,4,5]. With the continuous advancement of science and technology, the demand for imaging with higher resolution, larger FOV, and greater detail is increasing steadily [6,7,8]. When designing and utilizing optical imaging systems, researchers often face a fundamental challenge: how to balance imaging resolution and FOV. Higher resolution enables the system to capture more details and smaller features, thereby providing more accurate information and data. However, achieving high resolution often requires sacrificing the size of the FOV [9,10,11].
Imaging FOV represents the size of the entire spatial range or scene that an optical system can observe. An imaging system with a large FOV can cover a larger area, capturing a broader range of information. However, increasing FOV often results in a reduction in resolution. This is because within a limited optical system, focusing on a broader scene means dispersing limited light resources over a wider area, which affects resolution. Therefore, there is an inherent trade-off between resolution and FOV. Improving resolution often requires sacrificing FOV size, and imaging techniques with a large FOV may perform poorly in terms of resolution.
In recent years, many researchers have explored technical routes to large-FOV, high-resolution imaging. For example, the teams of Kopf and Sargent used a single high-resolution SLR camera to scan and capture a large number of photographs from a fixed location [12,13], which were then stitched into large-FOV images containing billions of pixels. However, there is a time interval between the scanned frames, so this method suits only static scenes. Wilburn et al. [14] proposed a large-scale imaging system composed of 96 cameras, each equipped with an individual data processor; by arranging the cameras in space and adjusting each camera’s FOV, a series of images with different but overlapping perspectives is acquired simultaneously, enabling the reconstruction of high-resolution, large-FOV images. Cossairt et al. at Columbia University employed computational imaging to achieve high resolution in small camera volumes, while using spherical lenses and detectors to obtain a large FOV [15]; however, their system exhibits significant spherical and chromatic aberration that computational imaging cannot completely remove. Suntharalingam et al. achieved high-resolution, large-FOV imaging by tiling multiple detectors into one continuous larger detector [16]; although this detector suits single-exposure image acquisition, the captured images suffer from stitching gaps and blind spots. Fisheye lenses can expand the FOV at a given resolution [17,18], but the imaging surface is distorted, so resolution cannot be kept consistent across the field; compression is most severe in the edge regions, leading to significant information loss. Overcoming this contradiction between resolution and FOV therefore remains challenging.
In comparison, the multi-scale imaging design proposed by Brady [19] from Duke University offers significant advantages, which are thoroughly discussed in the following section.

2. Multi-Scale Imaging System

Multiscale optical imaging technology seeks to integrate imaging capabilities at different scales within a single system to achieve comprehensive observation and analysis of a scene. Its development has significantly broadened the application areas of optical imaging, offering novel solutions for a variety of critical fields. A multiscale optical system is a multi-level imaging system that differs from traditional optical systems. Currently, multiscale optical systems consist of two levels: a large optical system at the front level and multiple small relay systems at the secondary level, as shown in Figure 1. The front-end large-sized optical system collects as much optical information as possible and performs preliminary aberration correction. The secondary small-sized optical systems form a multi-aperture relay imaging array, which segments and relays the intermediate image formed by the front-end optics and completes local residual aberration correction. This design integrates the FOV acquisition capability of large-sized optical systems with the local aberration correction capability of small-sized multi-aperture relay imaging arrays. Together, this configuration offers strong aberration correction capability, ensuring a large FOV while maintaining high-resolution imaging.
Although multiscale optical design offers significant advantages, when traditional lenses are used as the primary objective lens of the system, the aberrations generated by different FOV positions of the primary objective lens are different. This results in the need for different and numerous secondary relay systems to correct the local aberrations. As a consequence, the design and manufacturing processes become more complex.
In order to ensure that the local aberrations corrected by secondary relay imaging systems at different FOV positions are consistent, and that the secondary relay imaging systems are fully symmetrical, Brady et al. proposed a multi-scale monocentric optical imaging system [20,21,22,23,24,25,26]. The multi-scale monocentric optical system effectively addresses the issue of varying aberrations at different field-of-view positions, which require different secondary relay systems and an asymmetric arrangement for aberration correction. Due to the rotational symmetry of the FOV provided by the spherical primary lens, a larger FOV can be achieved, with the resulting intermediate image plane being spherical. The aberrations at different positions on the intermediate image plane are identical. Therefore, the secondary relay system can be completely symmetric and arranged uniformly on the sphere behind the intermediate image plane. This advantage significantly simplifies the optical system’s structure, reduces processing costs and assembly difficulties, and represents the most effective method for achieving both a large FOV and high resolution imaging simultaneously. The schematic diagram of the multi-scale monocentric optical imaging system is shown in Figure 2.
In order to achieve both a large FOV and high resolution imaging, this study investigates the principle of multi-scale monocentric imaging and designs a multi-scale monocentric optical system for imaging in the visible light band. The front group of the system adopts a double-layer monocentric objective, and the initial structural parameters of the double-layer monocentric objective are calculated through aberration theory. After obtaining the initial structural parameters, the structure can then be further optimized using optical design software.

3. Initial Structural Calculation of Multi-Scale Monocentric Imaging System

The multi-scale monocentric imaging system consists of a large spherically symmetric objective and a set of identical secondary relay imaging systems. The spherically symmetric objective performs the initial imaging of wide-area scenes, generating a curved intermediate image plane. The secondary relay imaging systems divide the large FOV, with overlap between the segmented fields of view reserved for later field stitching. Each secondary relay imaging system further corrects the residual aberrations of the primary objective lens and relays the intermediate image within its respective FOV channel. Finally, the sub-images are stitched together, achieving high-resolution imaging over a large FOV.
Since all secondary relay imaging systems are identical, we focus only on the model formed by the primary objective lens and a single secondary relay imaging system. The residual aberration of the monocentric objective can be compensated and corrected by the secondary relay imaging system. If the residual aberration is large, it increases the optical complexity of the relay system needed to correct it and, additionally, raises the tolerance sensitivity of the system, leading to higher assembly costs. Therefore, we first design the monocentric objective independently to eliminate most of the aberrations and then optimize the entire design by incorporating the secondary relay lens to further correct the residual aberrations in the system.
The primary objective lens of the multi-scale monocentric imaging system employs a spherically symmetric optical system, which consists of a transparent spherical lens and a series of concentric spherical shells. As the number of spherical shells increases, the degree of freedom in the design also increases, allowing for better optimization of various parameters to enhance system imaging quality. However, this also increases the design complexity. Taking all factors into consideration, the primary objective lens designed in this study is a double-layer monocentric spherical lens, as shown in Figure 3.
According to the theory of paraxial optics, the formula for calculating the focal length of the double-layer monocentric spherical lens can be derived, as shown in Equation (1), where n1 and n2 are the refractive indices of the double-layer spherical lens, and r1 and r2 are the curvature radii of the double-layer spherical lens.
f = \frac{r_1 r_2 n_1 n_2}{2\left[ r_2 (n_1 - 1) n_2 + r_1 (n_2 - n_1) \right]}
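As a quick consistency sketch (in Python; not part of the paper's workflow, and the index and radius values below are illustrative, not from this design), Equation (1) should reduce to the classical homogeneous ball-lens focal length f = nr/(2(n − 1)) when both layers share a single refractive index:

```python
def monocentric_focal_length(n1, n2, r1, r2):
    """Focal length of the double-layer monocentric lens, Eq. (1)."""
    return (r1 * r2 * n1 * n2) / (2 * (r2 * (n1 - 1) * n2 + r1 * (n2 - n1)))

# Limit check: with a single index (n1 = n2 = n) the expression must reduce
# to the classical homogeneous ball-lens result f = n*r/(2*(n - 1)).
n, r = 1.5168, 10.0   # illustrative index and radius
f_ball = monocentric_focal_length(n, n, r, r)
```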
The primary aberration of a double-layer monocentric spherical lens is spherical aberration. Using the theory of optical aberration, the formula for eliminating spherical aberration in a double-layer spherical objective is derived, as shown in Equation (2) [27].
\frac{1}{f^3} = \frac{2}{r_1^3}\left(1 - \frac{1}{n_1^3}\right) + \frac{2}{r_2^3}\left(\frac{1}{n_1^3} - \frac{1}{n_2^3}\right)
Therefore, Equations (1) and (2) can be combined to calculate the initial structure of the double-layer spherical objective, i.e., the initial values of r1 and r2. As a design example, we consider a double-layer cemented monocentric spherical lens system with a target focal length of f = 100 mm. The outer glass is the heavy flint H-ZF3 from the CDGM optical glass catalog, with a high refractive index of n = 1.71736; the inner glass is the crown H-QK3L from the same catalog, with a low refractive index of n = 1.48749. Substituting these parameters into the focal length Equation (1) and the spherical-aberration elimination condition Equation (2) yields r1 = 67.0787 mm and r2 = 36.6632 mm. The thicknesses of the double-layer spherical lens are therefore d1 = r1 − r2 = 30.4155 mm and d2 = r2 = 36.6632 mm. The two calculated radii of curvature and the thicknesses were then entered into the optical design software, with the imaging wavelength set to the visible band, 486 nm to 656 nm, giving the imaging optical path of the monocentric primary objective shown in Figure 4. It can be observed that rays are deviated only gently as they pass through the double-layer objective and, from the image position, that they are well focused at the image plane, indicating that the initial parameters obtained from aberration theory serve as a good starting point for optimization.
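The consistency of the quoted radii with Equation (2) can be verified in a few lines (a minimal numerical check, not part of the original design workflow):

```python
r1, r2 = 67.0787, 36.6632   # radii of curvature, mm
n1, n2 = 1.71736, 1.48749   # refractive indices: H-ZF3 (outer), H-QK3L (inner)

# Right-hand side of Eq. (2), then recover the focal length it implies.
inv_f3 = 2 / r1**3 * (1 - 1 / n1**3) + 2 / r2**3 * (1 / n1**3 - 1 / n2**3)
f = inv_f3 ** (-1 / 3)      # ≈ 100 mm, matching the target focal length
```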
The double-layer spherical lens corrects some aberrations, but the residual aberrations, especially the field curvature, need further correction by the subsequent relay lenses. Because the relay lens group takes on both residual aberration correction and relay imaging, its design is more complex and cannot be solved directly by analytical methods, as the monocentric objective can. Instead, a suitable initial structure must be selected from existing patent or lens libraries and then optimized in the later stage. An endoscopic configuration can indeed serve as an initial structure for near-field relay imaging; however, compared with endoscopic systems, the double-Gauss configuration offers greater design flexibility and advantages such as a larger aperture, so double-Gauss lenses are still more commonly adopted in practical optical engineering. For example, the AWARE 10 camera uses double-Gauss lenses as the initial structure of its relay imaging lenses [28]. We therefore select the classic double-Gauss objective as the initial structure of the secondary relay imaging system, as shown in Figure 5. The selected double-Gauss lens consists of six elements in a compact, symmetrical arrangement; the aperture stop is located between the third and fourth elements, which facilitates a larger relative aperture and imaging FOV.

4. System Structures for Optimized Design

The double-layer spherical lens shown in Figure 4 is combined with the six-element double-Gauss lens shown in Figure 5 for further joint optimization. The optimization steps involve scaling the lens sizes of the secondary relay system and optimizing the radii of curvature and thicknesses of each lens in both the primary and secondary relay systems, the inter-lens distances, the glass materials, and the position of the aperture stop. Through repeated iterative optimization, we ultimately obtained a design that satisfies all requirements; its imaging optical path is illustrated in Figure 6. The design wavelengths cover the visible band from 450 to 650 nm. Other parameters of the optical system include an F-number of 3, a focal length of 55.8 mm, and a field of view of ±1.5°.
The spot diagrams are presented in Figure 7. As shown in Figure 7, the FOV of a single secondary imaging system is ±1.5°, corresponding to a total FOV of 3°. Such a relatively small FOV implies that a large number of secondary imaging systems are required to realize a large FOV system. However, this configuration also offers important advantages. On the one hand, a small FOV is more favorable for aberration correction. On the other hand, stitching a large number of imaging channels makes it possible to achieve high resolution imaging with a very large number of pixels. In an ideal case, without considering the field overlap required for stitching, the required number of secondary imaging systems is approximately equal to the total desired FOV divided by the FOV of a single channel. Although employing a large number of secondary imaging systems enables large field, high resolution imaging, it also introduces significant challenges, including increased complexity in field stitching, massive image data processing, and difficulties in system assembly and alignment. Therefore, the number of secondary imaging systems should be carefully designed and optimized according to practical requirements and engineering constraints.
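The rough channel-count estimate above can be made concrete with a short sketch (the 60° target field and 10% stitching overlap below are hypothetical values, not from this design):

```python
import math

def channels_1d(total_fov_deg, channel_fov_deg, overlap_frac=0.1):
    """Rough number of relay channels along one axis: each channel
    advances by its own FOV minus the fraction reserved for stitching."""
    step = channel_fov_deg * (1 - overlap_frac)
    return math.ceil(total_fov_deg / step)

# Hypothetical 60° x 60° target field covered by the 3° channels of this design:
n_axis = channels_1d(60, 3, overlap_frac=0.1)   # channels per axis
n_total = n_axis ** 2                           # square-grid channel count
```

With zero overlap this recovers the paper's simple ratio (60°/3° = 20 channels per axis); reserving overlap for stitching pushes the count higher.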
From Figure 7, it can be observed that all imaging spots are compact, with an RMS spot size of 1.8 μm at the central FOV and 2 μm at the maximum FOV. The spot energy is well concentrated, demonstrating that the optical system effectively corrects aberrations, including spherical aberration, astigmatism, field curvature, and chromatic aberration. The rays in these spot diagrams fall almost entirely within the Airy disc radius, showing near-diffraction-limited performance. The number of pixels on the detector describes the resolution of the optical system, i.e., the amount of information the system can capture.
In practical imaging systems, the size of the imaging spot should be properly matched to the pixel size of the detector. Our analysis is based on a theoretical model in which the imaging spot size is assumed to be well matched to the detector. Under this assumption, the pixel count of an optical system is given by n = A/S, where A is the area of the imaging sensor and S is the area of a single pixel. According to this formula, the smaller the imaging spot of the optical system, the smaller the pixel that can be matched to it, and thus the higher the achievable imaging resolution.
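As a worked instance of n = A/S (the 10 mm × 10 mm sensor below is a hypothetical example, not the detector used in the paper), a 2 μm pixel matched to the 2 μm RMS spot gives:

```python
pixel = 0.002                 # pixel pitch in mm (2 um), matched to the RMS spot
sensor_w = sensor_h = 10.0    # hypothetical sensor side lengths in mm

A = sensor_w * sensor_h       # sensor area, mm^2
S = pixel ** 2                # area of a single pixel, mm^2
n_pixels = A / S              # n = A / S  -> 25 Mpixel for this sensor
f_nyquist = 1 / (2 * pixel)   # detector Nyquist limit: 250 lp/mm
```

The Nyquist value is what motivates the 250 lp/mm cutoff frequency quoted in the abstract.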
In this paper, the maximum RMS value of the imaging spot size in the designed optical system is constrained to 2 μm. Thus, a smaller pixel size imaging sensor can be selected, which is highly advantageous for enhancing the imaging resolution of the optical system. In addition, when multiple detectors are used, the imaging resolution is significantly improved and can achieve hundreds of millions of pixels, satisfying the imaging requirements for high resolution and wide-area imaging. It can be widely applied in various scenarios, such as city squares photography, station surveillance, live sports broadcasts and so on.
The modulation transfer function (MTF) of an optical system is one of the most effective methods for characterizing the imaging quality of an optical system. For a diffraction-limited optical imaging system, the MTF calculation is given by Equation (3) [29].
\mathrm{MTF}_{diff}(f_x) = \frac{2}{\pi}\left[\cos^{-1}\left(\frac{f_x}{f_{oco}}\right) - \frac{f_x}{f_{oco}}\sqrt{1 - \left(\frac{f_x}{f_{oco}}\right)^2}\,\right]
When an optical system has aberrations, the MTF degradation due to aberrations is expressed by Equation (4) [30], where Wrms represents the root-mean-square wavefront aberration, fx the spatial frequency (in cycles/mrad), and foco the optical cutoff frequency, foco = D/λ, with D the aperture diameter. A is an empirical constant, A = 0.18.
\mathrm{MTF}_{aberration}(f_x) \approx 1 - \left(\frac{W_{rms}}{A}\right)^2\left[1 - 4\left(\frac{f_x}{f_{oco}} - \frac{1}{2}\right)^2\right]
From transfer-function theory, the MTF of an actual optical system is the product of the diffraction-limited MTFdiff and the aberration term MTFaberration, as shown in Equation (5). Substituting Equations (3) and (4) into Equation (5) gives the MTF of the actual optical system.
\mathrm{MTF}_{optics} = \mathrm{MTF}_{diff} \times \mathrm{MTF}_{aberration}
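Equations (3)–(5) can be sketched directly in Python (a minimal implementation for normalized frequencies, with Wrms in waves and fx, foco in the same units):

```python
import math

def mtf_diff(fx, foco):
    """Diffraction-limited MTF of Eq. (3); fx and foco in the same units."""
    if fx >= foco:
        return 0.0
    u = fx / foco
    return (2 / math.pi) * (math.acos(u) - u * math.sqrt(1 - u * u))

def mtf_aberration(fx, foco, w_rms, A=0.18):
    """Empirical aberration factor of Eq. (4); w_rms in waves."""
    u = fx / foco
    return 1 - (w_rms / A) ** 2 * (1 - 4 * (u - 0.5) ** 2)

def mtf_optics(fx, foco, w_rms):
    """Actual-system MTF as the product of Eqs. (3) and (4), per Eq. (5)."""
    return mtf_diff(fx, foco) * mtf_aberration(fx, foco, w_rms)

# Example: at half the cutoff frequency an aberration-free system gives
# mtf_diff(0.5, 1.0) ≈ 0.391; any nonzero w_rms pulls the curve below this,
# which is the behavior plotted in Figure 8.
```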
The MTF curves for various wavefront aberrations are presented in Figure 8. As shown in Figure 8, as the wavefront aberration increases, the MTF of the optical system decreases significantly, deviating further from the optical transfer function curve of the diffraction limit.
Figure 9 illustrates the MTF of the monocentric multiscale imaging system designed in this paper. As depicted in Figure 9, the MTF curves for each FOV are consistent and closely follow the diffraction-limit curve. Comparing the MTF curves in Figure 8 and Figure 9 makes clear that the optical system designed in this paper effectively corrects the wavefront aberrations. When a detector with a pixel size of 2 μm is selected, its maximum (Nyquist) resolution is 250 lp/mm, so the MTF cutoff frequency can be set to 250 lp/mm. As shown in Figure 9, at this cutoff frequency the transfer function values for all fields of view exceed 0.35, and the MTF of each FOV is close to the diffraction limit, indicating excellent imaging performance.

5. Conclusions

Multi-scale optical design, especially monocentric multi-scale optical design, is currently considered one of the most effective approaches to achieving large-FOV, high-resolution imaging. This design integrates the large-FOV capability of large-scale optical systems with the local aberration correction capability of small-scale multi-aperture imaging arrays. In this study, we derive the formula for eliminating spherical aberration in a double-layer monocentric objective lens and use it to calculate the objective's initial structure. The combination of the front double-layer monocentric objective and the secondary relay imaging module is then optimized using optical design software. The optical system designed in this study demonstrates excellent imaging performance: the RMS spot size across the full FOV is below 2 µm, the MTF curves of each FOV are close to the diffraction-limit curve, and the transfer function values of all fields of view exceed 0.35 at the cutoff frequency.

Author Contributions

Conceptualization, X.W.; methodology, X.W. and Y.Y.; software, X.W.; writing—original draft preparation, Z.H.; writing—review and editing, X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Startup Foundation for Doctors of Yan’an University, grant number YDBK2022-72.

Data Availability Statement

The data that support the findings of this study are available from the corresponding authors upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ma, M.; Guo, F.; Cao, Z.; Wang, K. Development of an artificial compound eye system for three-dimensional object detection. Appl. Opt. 2014, 53, 1166–1172. [Google Scholar] [CrossRef] [PubMed]
  2. Wang, Z.; Huang, M.; Qian, L.; Sun, Y.; Lu, X.; Zhao, W.; Zhang, Z.; Wang, G.; Zhao, Y. Near-Space Wide-Area and High-Resolution Imaging System Design and Implementation. Sensors 2023, 23, 6454. [Google Scholar] [CrossRef]
  3. Balz, T.; Rocca, F. Reproducibility and replicability in SAR remote sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 3834–3843. [Google Scholar] [CrossRef]
  4. Wang, Z.; Kang, Q.; Xun, Y.; Shen, Z.; Cui, C. Military reconnaissance application of high-resolution optical satellite remote sensing. In Proceedings of the International Symposium on Optoelectronic Technology and Application 2014: Optical Remote Sensing Technology and Applications, Beijing, China, 13–15 May 2014; SPIE: Bellingham, WA, USA, 2014; Volume 9299, pp. 301–305. [Google Scholar]
  5. Zheng, G.; Horstmeyer, R.; Yang, C. Wide-field, high-resolution Fourier ptychographic microscopy. Nat. Photonics 2013, 7, 739–745. [Google Scholar] [CrossRef]
  6. Peng, L.; Bian, L.; Liu, T.; Zhang, J. Agile wide-field imaging with selective high resolution. Opt. Express 2021, 29, 35602–35612. [Google Scholar] [CrossRef]
  7. Wang, X.; Liu, C.; Qiao, L.; Zhou, J.; Bai, Y.; Sun, J. Imaging with high resolution and wide field of view based on an ultrathin microlens array. Phys. Rev. Appl. 2024, 21, 034035. [Google Scholar] [CrossRef]
  8. Zhang, Q.; Pan, D.; Ji, N. High-resolution in vivo optical-sectioning widefield microendoscopy. Optica 2020, 7, 1287–1290. [Google Scholar] [CrossRef]
  9. Deng, H.; Li, S.; Wang, L.; Xing, Y.; Wang, Q.-H. Dual-view integral imaging system with wide viewing angle and high spatial resolution. IEEE Photonics J. 2020, 12, 1–11. [Google Scholar] [CrossRef]
  10. Wang, X.; Zhang, X.; Zhu, Y.; Guo, Y.; Yuan, X.; Xiang, L.; Wang, Z.; Ding, G.; Brady, D.; Dai, Q.; et al. Panda: A gigapixel-level human-centric video dataset. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 3268–3278. [Google Scholar]
  11. Lohmann, A.W. Scaling laws for lens systems. Appl. Opt. 1989, 28, 4996–4998. [Google Scholar] [CrossRef] [PubMed]
  12. Kopf, J.; Uyttendaele, M.; Deussen, O.; Cohen, M.F. Capturing and Viewing Gigapixel Images. ACM Trans. Graph. 2007, 26, 93.1–93.8+93.10. [Google Scholar] [CrossRef]
  13. Sargent, R.; Bartley, C.; Dille, P.; Kellerm, J.; Nourbakhsh, I. Timelapse GigaPan: Capturing, sharing, and exploring timelapse gigapixel imagery. In Proceedings of the Fine International Conference on Gigapixel Imaging for Science, Pittsburgh, PA, USA, 11–13 November 2010; p. 1. [Google Scholar]
  14. Wilburn, B.; Joshi, N.; Vaish, V.; Talvala, E.-V.; Antunez, E.; Barth, A.; Adams, A.; Horowitz, M.; Levoy, M. High performance imaging using large camera arrays. In Proceedings of the ACM SIGGRAPH 2005 Papers; ACM: New York, NY, USA, 2005; pp. 765–776. [Google Scholar]
  15. Cossairt, O.S.; Miau, D.; Nayar, S.K. Gigapixel computational imaging. In Proceedings of the 2011 IEEE International Conference on Computational Photography (ICCP), Pittsburgh, PA, USA, 8–10 April 2011; IEEE: New York, NY, USA, 2011; pp. 1–8. [Google Scholar]
  16. Suntharalingam, V.; Berger, R.; Clark, S.; Knecht, J.; Messier, A.; Newcomb, K.; Rathman, D.; Slattery, R.; Soares, A.; Stevenson, C.; et al. A 4-side tileable back illuminated 3D-integrated Mpixel CMOS image sensor. In Proceedings of the 2009 IEEE International Solid-State Circuits Conference—Digest of Technical Papers, San Francisco, CA, USA, 8–12 February 2009; IEEE: New York, NY, USA, 2009. [Google Scholar]
  17. Xie, L.; Zhang, X.; Tu, D. Underwater large field of view 3D imaging based on fisheye lens. Opt. Commun. 2022, 511, 127975. [Google Scholar] [CrossRef]
  18. Pernechele, C.; Consolaro, L.; Jones, G.H.; Brydon, G.; Da Deppo, V. Telecentric F-theta fisheye lens for space applications. OSA Contin. 2021, 4, 783–789. [Google Scholar] [CrossRef]
  19. Brady, D.J.; Hagen, N. Multiscale lens design. Opt. Express 2009, 17, 10659–10674. [Google Scholar] [CrossRef]
  20. Brady, D.J.; Gehm, M.E.; Stack, R.A.; Marks, D.L.; Kittle, D.S.; Golish, D.R.; Vera, E.M.; Feller, S.D. Multiscale gigapixel photography. Nature 2012, 486, 386–389. [Google Scholar] [CrossRef] [PubMed]
  21. Tremblay, E.J.; Marks, D.L.; Brady, D.J.; Ford, J.E. Design and scaling of monocentric multiscale imagers. Appl. Opt. 2012, 51, 4691–4702. [Google Scholar] [CrossRef]
  22. Pang, W.; Brady, D.J. Galilean monocentric multiscale optical systems. Opt. Express 2017, 25, 20332–20339. [Google Scholar] [CrossRef]
  23. Huang, Y.H.; Fu, Y.G.; Zhang, G.Y.; Liu, Z. Modeling and analysis of a monocentric multi-scale optical system. Opt. Express 2020, 28, 32657–32675. [Google Scholar] [CrossRef]
  24. Liu, Z.; Liu, S.; Huang, Y.; Jin, S. Crosstalk in monocentric multiscale systems based on an internal stray light stop suppression method. Appl. Opt. 2024, 63, 1445–1456. [Google Scholar] [CrossRef]
  25. Yan, A.; Dong, S.; Wu, D. Optical design of monocentric multiscale three-line array airborne mapping camera. In Proceedings of the Seventh Asia Pacific Conference on Optics Manufacture and 2021 International Forum of Young Scientists on Advanced Optical Manufacturing (APCOM and YSAOM 2021), Shanghai, China, 28–31 October 2021; SPIE: Bellingham, WA, USA, 2022; Volume 12166, pp. 516–521. [Google Scholar]
  26. Son, H.S.; Johnson, A.; Stack, R.A.; Shaw, J.M.; McLaughlin, P.; Marks, D.L.; Brady, D.J.; Kim, J. Optomechanical design of multiscale gigapixel digital camera. Appl. Opt. 2013, 52, 1541–1549. [Google Scholar] [CrossRef] [PubMed]
  27. Marks, D.L.; Son, H.S.; Kim, J.; Brady, D.J. Engineering a gigapixel monocentric multiscale camera. Opt. Eng. 2012, 51, 083202. [Google Scholar] [CrossRef]
  28. Marks, D.L.; Llull, P.R.; Phillips, Z.; Anderson, J.G.; Feller, S.D.; Vera, E.M.; Son, H.S.; Youn, S.-H.; Kim, J.; Gehm, M.E.; et al. Characterization of the AWARE 10 two-gigapixel wide-field-of-view visible imager. Appl. Opt. 2014, 53, C54–C63. [Google Scholar] [CrossRef]
  29. Hardie, R.C.; LeMaster, D.A.; Ratliff, B.M. Super-resolution for imagery from integrated microgrid polarimeters. Opt. Express 2011, 19, 12937–12960. [Google Scholar] [CrossRef] [PubMed]
  30. Holst, G.C. Electro-Optical Imaging System Performance, 5th ed.; SPIE Press: Bellingham, WA, USA, 2008; p. 84. [Google Scholar]
Figure 1. Schematic diagram of a double-level optical system.
Figure 2. Schematic illustration of a multiscale imaging system.
Figure 3. Double-layer monocentric spherical lens.
Figure 4. Layout of the double-layer monocentric objective lens.
Figure 5. Layout of the double-Gauss lens.
Figure 6. Layout of the optimized lens.
Figure 7. Spot diagrams of the optimized lens.
Figure 8. MTF curves for different wavefront aberrations of an actual optical system.
Figure 9. MTF of the designed lens system.

Share and Cite

MDPI and ACS Style

Wu, X.; Yang, Y.; He, Z. Design of a Monocentric Multiscale Optical System for Near-Diffraction-Limited Imaging with High Resolution and Large Field of View. Optics 2026, 7, 4. https://doi.org/10.3390/opt7010004
