Article

Data-Driven Interactive Lens Control System Based on Dielectric Elastomer

1 School of Mechanical and Electronic Engineering, Nanjing Forestry University, Nanjing 210037, China
2 School of Mechanical Engineering, Southeast University, Nanjing 211189, China
* Authors to whom correspondence should be addressed.
Technologies 2026, 14(1), 68; https://doi.org/10.3390/technologies14010068
Submission received: 16 December 2025 / Revised: 26 December 2025 / Accepted: 13 January 2026 / Published: 16 January 2026
(This article belongs to the Special Issue AI Driven Sensors and Their Applications)

Abstract

To address the dynamic analysis and interactive imaging-control problems that arise as bionic soft lenses deform, dielectric elastomer (DE) actuators are separated from a convex lens, and data-driven eye-controlled motion technology is investigated. Based on DE properties consistent with the deformation characteristics of hydrogel electrodes, the motion and deformation of the eye-controlled lens are studied as functions of film prestretch, lens size, and driving voltage. The results show that, as the driving voltage increases to 7.8 kV, the focal length of a lens with prestretch λ = 4 and diameter d = 1 cm varies between 112.5 mm and 49.7 mm, and the maximum focal-length change reaches 58.9%. For the eye-control design and its experimental verification, a high-voltage DC supply was programmed, and the eye-movement signals used to control the lens were analyzed in MATLAB (R2023b). Eye-controlled real-time interactive motion and tunable imaging of the lens were realized, with a response efficiency above 93%. The adaptive lens system developed in this research has the potential to be applied to medical rehabilitation, exploration, augmented reality (AR), and virtual reality (VR) in the future.

1. Introduction

Among all the kinds of information the human sensory system can receive, visual information occupies a crucial position (about 83%) among the five main senses (vision, hearing, taste, smell, and touch). Historically, people's growing demands for visual-information acquisition and processing have driven the continuous updating and iteration of optical instruments, which have gradually evolved from fixed-focal-length systems to tunable ones [1].
At present, two tunable methods are common in optical devices on the market: digital zoom and optical zoom. Digital zoom is implemented in software: the built-in graphics processor takes some of the pixels of the image sensor, such as a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor, and fills the entire frame using interpolation or other algorithms. Additional processing is usually required to mitigate the resulting loss of image quality and, as far as possible, close the gap with the original image. Optical zoom is based on hardware or a manual adjustment mechanism [2]: a mechanical device adjusts the relative positions of the lens groups, thereby changing the focal length of the system.
In some special scenarios, such as physical therapy [3], robot vision, integrated-circuit lithography machines, and smartphones, the size of the optical system is strictly limited while high-quality, efficient imaging is still required. Inspired by the tunable principle of the human eye, a new kind of soft lens has gradually attracted researchers' attention in recent years. Unlike traditional tunable systems, such lenses generally need no mechanical moving parts: the focal length is adjusted by changing the curvature of the lens surfaces or the refractive index of the medium material. These bionic lenses are compact, low in power consumption, and fast in response, and they have great market application value.
With the development of robotics, intelligent robots are emerging that interact with the natural world, adapt autonomously to complex environments, and work cooperatively. One important application field is the human–machine interface (HMI) and interaction. A variety of traditional robot interactions based on hard materials have been developed for different needs, but interaction between soft machines and humans remains to be developed and explored in depth [4,5,6,7].
Human eye movements can be divided into the following types: ① saccades, in which the focus jumps immediately from one point to another; ② smooth pursuit, in which slowly changing objects are tracked; ③ compensatory movements, which keep the focus relatively fixed; and ④ vergence movements, which maintain stereoscopic vision. The activity of living cells, tissues, and organs is often accompanied by electrical changes. Bioelectrical signals are typically weak (microvolts to millivolts), low in frequency (0.05–100 Hz), and easily disturbed. To date, electroencephalogram (EEG) and electrocardiogram (ECG) signals have been widely used in medical and scientific research [8].
Similarly, the measurable potential changes accompanying eye movements are called the electro-oculogram (EOG). The EOG is the easiest of all bioelectrical signals to measure: its amplitude ranges from 15 μV to 200 μV, and when the head position is relatively fixed, interference with the eye signal is small. Because the frequency difference between the EOG signal and the noise is obvious, the noise can be removed by simple filtering. EOG can be used to monitor eye movements of up to 70° from the central point with an accuracy of 2°. EOG signals have been successfully applied to multi-degree-of-freedom motion control of wheelchairs and robots [9,10,11].
Among soft tunable-lens materials, electroactive polymers (EAPs) are the most common. EAP is a collective name for a class of new soft active materials (SAMs) that efficiently convert electric-field excitation into mechanical response; they are often referred to as 'artificial muscles' because of their good biocompatibility [12,13,14]. According to the actuation principle, EAPs fall into two categories: ionic EAPs, driven by ion migration or diffusion, and electronic EAPs, driven by applied electric fields and electrostatic forces. On the one hand, ionic EAPs require lower driving voltages than electronic EAPs [15]; on the other hand, electronic EAPs can produce larger strains than ionic EAPs. The dielectric elastomer (DE), an electronic EAP, is a common material for bionic devices [16,17].
In the deformation theory of DE materials, Suo [18] developed the uniform-field theory and established a model of force-electric energy conversion in dielectric elastomers based on thermodynamic equilibrium. The model was applied to the analysis of vacuum (an elastic dielectric with vanishing stiffness), incompressible materials, ideal DEs, electrostrictive materials, and nonlinear dielectric materials. Dissipative processes, such as viscoelasticity and dielectric relaxation, were studied using non-equilibrium thermodynamics. These models are recognized as the fundamental theory of DEs and continue to underpin ongoing innovation.
Carpi et al. [19] proposed a DE-based lens structure in which two radially prestretched DE films enclosed a liquid, forming a flexible chamber that acted as a convex lens. The tunable function was realized by a circular DE actuator (DEA) around the lens. For an initial lens diameter of 7.6 mm, increasing the driving voltage from 0 to 3.5 kV changed the focal length from 22.73 to 16.72 mm. However, because the electrode material was coated in the optical path, lens transparency was a challenge. Wang et al. [20] prepared an all-solid-state tunable soft lens with transparent electrodes of water-based polyurethane and PEDOT. The experiments showed a focal-length change of 209%; the lens was not disturbed by vibration and was insensitive to its placement orientation.
Lau et al. [21] designed a soft lens with a diameter of 8 mm in which the DE film thickness was reduced by a stacked arrangement, helping to lower the actuating voltage; it focused on objects from 15 to 50 cm away over a voltage range of 0 to 1.8 kV. Zhong et al. [22] proposed a soft tunable lens based on an ionic polyelectrolyte elastomer, which achieved a focal-length change of 46.4% under voltage, exceeding that of the human eye. Yin et al. [23] presented a DEA-based lens improved by an origami mechanism with a modified Yoshimura tessellation; the focal length of the 10 mm diameter solid lens showed a tunability of 17%. Current studies mostly focus on the design of soft lenses, while theories and data on interaction technology remain relatively scarce.
To further extend and deepen the theory and application of soft lenses, eye-control interaction technology was combined here with a more intelligent lens designed to imitate the human crystalline lens. The effects of the DEA layout, lens size, eye-movement signal, and applied voltage on the adaptive interactive lens were studied. Finally, experiments at room temperature were conducted to verify the theoretical model.

2. Soft Lens Design and Eye Interactive Control

2.1. Soft Lens Design

Acrylate copolymers are among the most widely studied DE materials; these elastomers exhibit large actuation strain, pressure, and energy density when high voltages are applied. However, most of these polymers are somewhat viscoelastic, which slows their response, and prestretch is the key parameter for achieving larger electro-induced strain and higher energy density. Electromechanical instability (EMI) can also cause breakdown and slow response in thin films of low elastic modulus. In this work, the acrylic copolymer VHB 4905/4910 from 3M (St. Paul, MN, USA) was used to make the soft lenses, and electrocardiogram conductive paste (an ionic gel) was used as the flexible electrode.
The motion and deformation of the bionic lens were controlled by electro-oculogram (EOG) signals, which reflect changes in the potential difference between the cornea and the fundus (retina). Figure 1a–c illustrate the schematic diagram. The human right eye is shown in Figure 1a: the medial and lateral rectus muscles lie on the inner and outer sides of the eye, while the superior and inferior rectus muscles and the oblique muscles are distributed around the eyeball. Figure 1b shows a bionic lens based on carbon grease electrodes: the DE film wraps the transparent medium (liquid/hydrogel), four actuators are distributed around the lens, and the supporting frame is made of a hard acrylic sheet.
Figure 1c shows the lens based on transparent hydrogel electrodes, i.e., the conductive paste. The actual lenses are shown in Figure 1d. In the left picture, the prestretch λ is 2.5 and the diameter d is 1 cm; the lens is sealed with water, and four carbon grease actuators are distributed around it. In the right picture, λ is 2.5 and d is 1.9 cm; water is encapsulated inside the lens, and four actuators with hydrogel electrodes surround it.
The lens is produced mainly by the negative-pressure method, and the resulting convex lens is stable, as shown in Figure 2a,b. With transparent conductive paste used as the actuator electrodes and the filling medium, the finished lens in its original state (no applied voltage) is displayed in Figure 2c; the prestretch ratio λ of the lens film can be set between 1 and 4, and the height of the lens is 2h. In the actuated state at 3 kV, the overall height of the lens becomes 2 × 2.4h (Figure 2d). The applied voltage ranges from 0 to 6 kV in the experiments.
The 20P15 model of the Glassman FR series is used as the high-voltage power supply; it receives PC instructions through a serial port and returns its status information. LabVIEW software controls the high-voltage power supply as follows: first, the serial port is opened and configured; control instructions are then sent to the Glassman voltage amplifier; finally, the feedback from the amplifier is received and the serial port is closed. The voltage-control algorithm for the lens is implemented in LabVIEW.
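The open-configure-send-receive-close sequence above can be pictured with a minimal PC-side sketch. The example below assumes Python with the pyserial package; the port name and the command/reply strings are placeholders, not the actual Glassman FR serial protocol, and the ramping loop is only illustrative.

```python
# Minimal sketch of the PC-side serial control sequence described above
# (open port, configure, send a setpoint, read the status reply, close).
# The port name and the command/reply formats are placeholders, NOT the
# real Glassman FR protocol; consult the supply's manual for the actual syntax.
import serial  # pyserial

def set_hv_output(port: str, voltage_kv: float) -> str:
    """Send one voltage setpoint to the HV supply and return its status reply."""
    with serial.Serial(port, baudrate=9600, timeout=1.0) as ser:  # open + configure
        command = f"SET {voltage_kv:.2f}\r"             # hypothetical command format
        ser.write(command.encode("ascii"))              # send control instruction
        reply = ser.readline().decode("ascii").strip()  # feedback from the amplifier
    return reply                                        # port closes on block exit

# Example: ramp the actuating voltage in small steps instead of one jump,
# which is gentler on the DE film (step values chosen arbitrarily here).
if __name__ == "__main__":
    for kv in (0.0, 1.0, 2.0, 3.0):
        print(set_hv_output("COM3", kv))
```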

2.2. Eye Movement Interactive Control

The eye-movement interactive control design of the bionic lens is shown in Figure 3; it includes the EOG sensor acquisition module, the MCU control module, the high-voltage relay group, the high-voltage power supply, the soft lens, and the upper-computer analysis and processing module. The signal acquisition circuit collects the epidermal electrical signals around the eyes. To reduce offset and obtain a suitable contact resistance, the skin is first cleaned with alcohol or water. To measure the ocular potential, electrodes are usually placed on the skin to the left and right of the eyes for horizontal motion and above and below the eyes for vertical motion, with the ground electrode on the forehead.
The EOG sensor acquisition module, the microcontroller unit (MCU) control module, and the soldered high-voltage relay group are designed as follows. The EOG acquisition module uses two AD8232 analog front-end modules (Shenzhen Youxin Electronic Technology Co., Ltd., Shenzhen, China) to collect the up–down and left–right eye-movement signals, respectively. After internal amplification, each signal is mapped to an analog voltage of 0–3.3 V.
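To make the signal chain concrete, the sketch below converts a raw MCU ADC reading back to an approximate EOG amplitude. It assumes the STM32F103's 12-bit ADC spanning 0–3.3 V; the mid-rail offset and the front-end gain of 1100 are illustrative assumptions, not values given in the paper.

```python
# Sketch of recovering the EOG amplitude from a raw ADC reading, assuming the
# STM32F103's 12-bit ADC (0..4095) spans the 0-3.3 V output of the AD8232 stage.
# The mid-rail baseline and the overall gain are illustrative values only.
ADC_BITS = 12
V_REF = 3.3          # V, ADC reference
AFE_GAIN = 1100      # assumed total front-end gain (typical, not from the paper)
V_OFFSET = V_REF / 2 # assumed mid-rail baseline

def adc_to_eog_microvolts(count: int) -> float:
    v_out = count / (2**ADC_BITS - 1) * V_REF     # ADC counts -> volts at the MCU pin
    return (v_out - V_OFFSET) / AFE_GAIN * 1e6    # remove baseline, undo gain, to uV

print(adc_to_eog_microvolts(2170))  # roughly +90 uV above baseline
```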
The instruction set and the format of the response data frame are shown in Table 1 and Table 2, respectively. When the MCU receives the instruction to query the lens connection status, or an instruction to close or disconnect a relay, the data fields of the response (the up–down channel, the left–right channel, and the timestamp) are all 0x00. When it receives the instruction to query EOG sensor data, the up–down and left–right channel fields are filled with the analog-to-digital conversion results, and the timestamp field is assigned at the moment the instruction is executed, counted from zero at the MCU power-on moment.
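A minimal sketch of how the 12-byte response frame of Table 2 might be validated and unpacked on the host side is given below. The byte order and the CRC-8 polynomial (0x07) are assumptions, since the paper does not specify them.

```python
# Sketch of parsing the 12-byte response frame laid out in Table 2
# (header 0xEB, echoed instruction, state 0x80, two 2-byte ADC channels,
# 4-byte millisecond timestamp, CRC8 over the first 11 bytes).
# Byte order and the CRC-8 polynomial are assumptions.
import struct

def crc8(data: bytes, poly: int = 0x07) -> int:
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def parse_frame(frame: bytes) -> dict:
    if len(frame) != 12 or frame[0] != 0xEB:
        raise ValueError("not a valid response frame")
    if crc8(frame[:11]) != frame[11]:
        raise ValueError("CRC mismatch")
    # ">" (big-endian) is an assumption; adjust to the MCU's actual byte order.
    _header, instr, state, ud, lr, ts = struct.unpack(">BBBHHI", frame[:11])
    return {"instruction": instr, "state": state,
            "up_down_adc": ud, "left_right_adc": lr, "timestamp_ms": ts}
```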
The MCU control module adopts an STM32F103C8T6 main control chip (Shenzhen Youxin Electronic Technology Co., Ltd., Shenzhen, China). The GPIO pins of the chip control the opening and closing of the high-voltage relay group, and the ADC pins convert the analog voltage signals from the AD8232 modules into digital values. The main control chip uses a TTL serial port to receive instructions from the upper computer and to send back response data frames; the control schematic is shown in the left half of Figure 4. The high-voltage relay group module consists of four relays (type CRSTHV-5-1U-6k-9; Ningbo Yinzhou Port Relay Co., Ltd., Ningbo, China) and their peripheral circuits. The low-voltage control side of each relay is connected to a GPIO pin of the MCU, while the high-voltage side has two normally open contacts connected to the soft lens and the high-voltage power device. When the GPIO pin outputs a high level, the normally open contact of the relay closes; when it outputs a low level, the contact opens (right part of Figure 4).
The four DEAs around the bionic lens are coated with electrodes, and their positive and negative terminals are led out. The positive terminals are connected to the outputs of the relay group, and the negative terminals are connected to the power supply. When the normally open contact of a relay closes, the high voltage is applied to the corresponding DEA, and the DE film deforms to actuate the motion of the soft lens.
The upper-computer analysis and processing module is developed on the MATLAB App Designer platform. Two serial ports are opened through API calls and are used to communicate with the microcontroller module and the high-voltage power device, respectively. The host computer first opens these serial connections, then sends the query-EOG-sensor-data instruction to the MCU at a set frequency while receiving the data frames sent back by the microcontroller. Each frame is parsed according to the frame-format definition to obtain the up–down channel, left–right channel, and timestamp data, from which the signal waveforms of the two channels are reconstructed.
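The polling behavior described above can be sketched as follows, assuming Python with pyserial; the port names, baud rates, polling rate, and byte order are placeholders rather than values reported in the paper.

```python
# Minimal sketch of the host-side polling loop: one serial port to the MCU, one
# to the HV supply, query EOG data (command 0x20 from Table 1) at a fixed rate,
# and unpack the two channel values from each 12-byte reply frame (Table 2).
# Port names, baud rates, polling rate, and byte order are placeholders.
import struct
import time
import serial  # pyserial

POLL_HZ = 50  # assumed polling rate, not stated in the paper

def poll_eog(mcu_port="COM4", hv_port="COM3", n_samples=500):
    up_down, left_right = [], []
    with serial.Serial(mcu_port, 115200, timeout=0.1) as mcu, \
         serial.Serial(hv_port, 9600, timeout=0.1):
        for _ in range(n_samples):
            mcu.write(bytes([0x20]))            # "query EOG sensor data"
            frame = mcu.read(12)                # 12-byte response frame
            if len(frame) == 12 and frame[0] == 0xEB:
                ud, lr = struct.unpack(">HH", frame[3:7])  # two ADC channels
                up_down.append(ud)
                left_right.append(lr)
            time.sleep(1.0 / POLL_HZ)
    return up_down, left_right
```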
The upper-computer software uses a feature-extraction algorithm adapted from face detection [24] to process the features of unidirectional eye movements. Figure 5a shows the original EOG signal of a subject in one experiment; data in the range 0–210 s are extracted for analysis. After approximate zero averaging, a window width of five points is selected for moving-average filtering. The blue curve in Figure 5b is the waveform after moving-average filtering, and the limiting-filter thresholds for the up, down, left, and right channels are set as shown by the orange curves. The reference signal is taken as $\sin\frac{2\pi n}{21}$, $n = 0, 1, \ldots, 21$, and the motion correlation coefficient is then calculated between this reference and the output of the limiting filter.
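A compact NumPy sketch of this preprocessing chain (zero averaging, 5-point moving average, limiting filter, and sliding correlation with the sinusoidal reference) is shown below; the threshold value is illustrative, since the paper sets thresholds per channel.

```python
# Sketch of the preprocessing chain described above: approximate zero averaging,
# a 5-point moving average, amplitude limiting against a threshold, and a sliding
# correlation coefficient with the sin(2*pi*n/21), n = 0..21 reference template.
# The threshold value is illustrative only.
import numpy as np

def movement_correlation(x, thresh=0.5):
    x = np.asarray(x, dtype=float)
    x = x - np.mean(x)                                  # approximate zero averaging
    x = np.convolve(x, np.ones(5) / 5, mode="same")     # 5-point moving average
    x = np.clip(x, -thresh, thresh)                     # limiting filter
    ref = np.sin(2 * np.pi * np.arange(22) / 21)        # reference, n = 0..21
    ref = ref - ref.mean()
    corr = np.empty(len(x) - len(ref) + 1)
    for i in range(len(corr)):                          # sliding correlation coefficient
        seg = x[i:i + len(ref)]
        corr[i] = 0.0 if seg.std() == 0 else np.corrcoef(seg, ref)[0, 1]
    return corr
```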
Furthermore, the state machine for feature extraction of unidirectional eye movements is illustrated in Figure 6. By setting appropriate values for the variable X and the thresholds thresh_up, thresh_down, thresh_left, and thresh_right, the state machine extracts unidirectional eye-movement features from the movement correlation-coefficient curves at the moments when movements occur and end. According to the detected eye movements, the corresponding instructions are sent to the MCU module to control the relay group, as sketched below.
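The following is a hedged sketch of a threshold-based detector in the spirit of Figure 6. It does not reproduce the exact states of the paper's state machine, but it shows how a dwell-length parameter X and per-direction thresholds can turn the correlation curves into (direction, start, end) events, which can then be mapped to the relay commands of Table 1.

```python
# Hedged sketch of a threshold-based event detector inspired by Figure 6.
# It emits (direction, start_index, end_index) tuples whenever a channel's
# correlation stays above its threshold for at least X samples.
# The exact states and threshold values of the paper are not reproduced here.
def detect_events(corr_by_dir, thresholds, X=5):
    """corr_by_dir / thresholds: dicts keyed by 'up', 'down', 'left', 'right'."""
    events = []
    for direction, corr in corr_by_dir.items():
        thresh = thresholds[direction]
        state, start = "idle", None
        for i, c in enumerate(corr):
            if state == "idle" and c > thresh:
                state, start = "active", i          # candidate movement begins
            elif state == "active" and c <= thresh:
                if i - start >= X:                  # long enough to count as a movement
                    events.append((direction, start, i))
                state = "idle"
        if state == "active" and len(corr) - start >= X:
            events.append((direction, start, len(corr)))
    return sorted(events, key=lambda e: e[1])
```

Each detected direction can then be translated into the corresponding close-relay (0x3*) or disconnect-relay (0x4*) instruction of Table 1.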
The soft lens moves in the same direction as the eye. Table 3 records the times at which each unidirectional eye movement begins and ends in the waveform of Figure 5a, together with the recognition results of the state machine (Figure 6) and the corresponding absolute errors. The recognition results of the state machine are accurate.

3. Data Analysis and Result Discussion

In the initial state, the lens radius is A and the surface area is $s_A = \pi A^2$. The lens is regarded as part of a sphere with curvature radius R; its projection onto the middle plane is a circle of radius a, and the distance between the top of the sphere and the middle plane is t (Figure 7a). The geometry then satisfies $R^2 - (R - t)^2 = a^2$. The curvature radius R is related to the volume V of liquid in the spherical lens by $R = \frac{V}{\pi t^2} + \frac{t}{3}$, i.e., $V = \pi \left( R t^2 - \frac{t^3}{3} \right)$ [25,26,27]. The focal length is related to the curvature radii by $\frac{1}{f} = \left( \frac{n_1}{n_m} - 1 \right) \left( \frac{1}{R_1} - \frac{1}{R_2} \right)$, where $n_1$ and $n_m$ are the refractive indices of the lens and the surrounding medium, respectively; because the lens works in air, $n_m = 1$. For the symmetric convex lens, the two surface curvature radii are equal in magnitude, $R_1 = -R_2 = R$, so that $\frac{1}{f} = \frac{2(n_1/n_m - 1)}{R}$.
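As a numerical illustration of these relations, the sketch below recovers R from the cap height t and the projected radius a, then evaluates the focal length for a symmetric biconvex lens. The refractive index n1 = 1.33 (water) and the geometry values are illustrative assumptions, not the measured parameters of the fabricated lens.

```python
# Numerical sketch of the spherical-cap relations above. The geometry values and
# the refractive index n1 = 1.33 (water filling) are illustrative assumptions.
def curvature_radius(a_mm: float, t_mm: float) -> float:
    # From R^2 - (R - t)^2 = a^2  =>  R = (a^2 + t^2) / (2 t)
    return (a_mm**2 + t_mm**2) / (2 * t_mm)

def focal_length(R_mm: float, n1: float = 1.33, nm: float = 1.0) -> float:
    # Thin symmetric biconvex lens (R1 = -R2 = R): 1/f = 2 (n1/nm - 1) / R
    return R_mm / (2 * (n1 / nm - 1))

R = curvature_radius(a_mm=5.0, t_mm=0.6)   # a = half of a 1 cm aperture, t = cap height
print(R, focal_length(R))                  # R ~ 21.1 mm, f ~ 32.0 mm
```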
To verify the established focal-length model, a liquid lens based on DE was designed and fabricated as shown in Figure 7b. According to the convex-lens imaging law, with u the object distance, v the image distance, and f the focal length, $\frac{1}{u} + \frac{1}{v} = \frac{1}{f}$. The double focal-length method (u = 2f, which gives an inverted, equal-sized real image) was used to measure the focal-length change before and after applying voltages.
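For completeness, substituting u = 2f into the thin-lens equation confirms why the double focal-length configuration yields an inverted, equal-sized real image:

```latex
\frac{1}{2f} + \frac{1}{v} = \frac{1}{f}
\quad\Rightarrow\quad
\frac{1}{v} = \frac{1}{2f}
\quad\Rightarrow\quad
v = 2f ,
\qquad
\left|\frac{v}{u}\right| = 1 .
```

The focal length is therefore read off as half of the measured object (or image) distance at which the image is sharp and equal in size to the object.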
Laboratory results revealed that larger initial focal lengths, larger coated-electrode areas, and larger equal-biaxial prestretch ratios all produce greater focal-length changes [28,29,30,31]. The experimental data are listed in Table 4, and the voltage-focal-length relationship is shown in Figure 7c. With the VHB film equal-biaxially prestretched fourfold (λ = 4) and the voltage increased to 7.8 kV, the focal length varied from 112.5 mm to 49.7 mm; the maximum focal-length change, 58.9%, was obtained over the range 95.3–39.2 mm.
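As a quick check on the quoted percentages, using the figures stated above:

```latex
\frac{95.3 - 39.2}{95.3} \approx 58.9\% ,
\qquad
\frac{112.5 - 49.7}{112.5} \approx 55.8\% .
```

The 58.9% maximum change thus corresponds to the 95.3–39.2 mm range, while the tabulated 112.5–49.7 mm sweep in Table 4 corresponds to a change of about 55.8%.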
The interactive movement is controlled in real time through the interface buttons. Serial communication receives the data from the STM32; the create and save buttons create files and save the eye-movement data, respectively; and the start and pause buttons start and stop the experiment in MATLAB. The eye-movement signal panel of the interface shows the acquisition and processing results, including the preprocessed two-channel signal and the per-frame energy waveform after segmentation.
The hardware and software designs are verified experimentally in Figure 8. The adaptive lens successfully recognizes human eye movements and zoom-control signals; through the circuits and the MATLAB control program, synchronized interaction between the soft lens and the eye movements is realized. In the reference state, the driving voltage Φ is 0 kV, the lens faces the number 8 at the center of Figure 8(ai), and the spacing between adjacent numbers is about 4 mm. The control effect for upward, downward, leftward, and rightward movements was observed by looking at the numbers 7, 1, 9, and 5 around the central 8, from which the lens travel in each direction (about 2 mm) was calculated. In addition, oblique movements such as those of human strabismus (e.g., diagonally up or down) were also observed, and the efficiency of the recognition method remained normal.
Furthermore, observing the size change of the number 8 shows the zooming effect: the number appears significantly enlarged. The lens film in Figure 8 is prestretched threefold, the lens diameter is 1.2 cm, and the actuating voltage is 4 kV; under this voltage the soft lens achieves a travel of 5 mm. Since the distance between the number and the lens is less than the focal length f, the resulting image is virtual. In short, the response time of the adaptive lens was less than 1 s relative to human vision, and the eye-movement interaction efficiency exceeded 93% during the experiments. The data-driven interactive lens can be reused for several weeks provided there is no electrical breakdown, internal liquid leakage, or cracking of the hydrogel electrodes.

4. Conclusions

Based on the dielectric elastomer actuator, the lens structure and its focal-length change model were constructed. The ocular signals were collected through bipolar electrode leads, and a control circuit was designed to amplify the signals and transmit them to the upper computer; meanwhile, the high-voltage relay group was switched synchronously with the eye-movement signals. The quadruple-prestretched lens (λ = 4) achieved a controllable focal-length change of 58.9%.
MATLAB software was used to amplify and filter the ocular signals, and a recognition algorithm was designed on the basis of these signals to identify eye movements, enabling online interaction between the human eyes and the bionic lens. Up–down, left–right, and zoom motions of the lens were controlled in real time. Compared with previous soft-device interactions, most of which were pre-programmed or manually controlled, the online interaction mode of the adaptive lens had a control delay of 0.8 s, and the eye-control efficiency exceeded 93% during the experiments.
In the future, deep learning algorithms can be introduced to label the electrical signals generated by various eye movements and train on them to improve recognition accuracy. In brief, with the continuous updating and iteration of optical instrument technologies, soft adaptive lenses will have broad application prospects in unmanned aerial vehicles (UAVs), optical communication, medical diagnosis, robot vision, consumer electronics, and wearable augmented reality (AR)/virtual reality (VR) devices.

Author Contributions

Conceptualization, H.Z.; Methodology and Validation, Z.X. and Z.Z.; Writing—original draft, H.Z.; Review and editing, J.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Youth Foundation (Grant No. 52405304).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zhao, Z.Z.; Kuang, F.L.; Zhang, N.H.; Li, L. Adaptive liquid lens with tunable aperture. IEEE Photonics Technol. Lett. 2021, 33, 1297–1300. [Google Scholar] [CrossRef]
  2. Zhang, C.; Qin, J.G.; Gao, Y.; Cao, L.L.; Liu, X.J.; Zhu, Z.C. Soft lenses with large focal length tuning range based on stacked PVC gel actuators. Smart Mater. Struct. 2024, 33, 095002. [Google Scholar] [CrossRef]
  3. Liu, Y.; Yue, S.; Tian, Z.; Zhu, Z.J.; Li, Y.J.; Chen, X.Y.; Wang, Z.L.; Yu, Z.Z.; Yang, D. Self-Powered and Self-Healable Extraocular-Muscle-Like Actuator Based on Dielectric Elastomer Actuator and Triboelectric Nanogenerator. Adv. Mater. 2024, 36, 2309893. [Google Scholar] [CrossRef]
  4. Li, G.R.; Chen, X.P.; Zhou, F.H.; Liang, Y.M.; Xiao, Y.H.; Cao, X.; Zhang, Z.; Zhang, M.Q.; Wu, B.S.; Yin, S.; et al. Self-powered soft robot in the Mariana Trench. Nature 2021, 591, 66–71. [Google Scholar] [CrossRef]
  5. Sirbu, I.D.; Moretti, G.; Bortolotti, G.; Bolignari, M.; Diré, S.; Fambri, L.; Vertechy, R.; Fontana, M. Electrostatic bellow muscle actuators and energy harvesters that stack up. Sci. Robot. 2021, 6, eaaz5796. [Google Scholar] [CrossRef]
  6. Xie, X.; Zheng, S.; Tan, J.; Cheng, J.Z.; Cai, J.C.; Xu, Z.S.; Shiju, E. An Integrated Charge Excitation Alternative Current Dielectric Elastomer Generator for Joint Motion Energy Harvesting. Adv. Mater. Technol. 2024, 9, 2301172. [Google Scholar] [CrossRef]
  7. Deaconescu, A.; Deaconescu, T. Compliant Parallel Asymmetrical Gripper System. Technologies 2025, 13, 86. [Google Scholar] [CrossRef]
  8. Wang, J.; Xu, S.; Dai, Y.; Gao, S. An Eye Tracking and Brain–Computer Interface-Based Human–Environment Interactive System for Amyotrophic Lateral Sclerosis Patients. IEEE Sens. J. 2023, 23, 24095–24106. [Google Scholar] [CrossRef]
  9. Li, J.R.; Wang, Y.; Liu, L.W.; Xu, S.; Liu, Y.J.; Leng, J.S.; Cai, S.Q. A biomimetic soft lens controlled by electrooculographic signal. Adv. Funct. Mater. 2019, 29, 1903762. [Google Scholar] [CrossRef]
  10. Kaur, A. Wheelchair control for disabled patients using EMG/EOG based human machine interface: A review. J. Med. Eng. Technol. 2021, 45, 61–74. [Google Scholar] [CrossRef]
  11. Ileri, R.; Latifoğlu, F.; Demirci, E. A novel approach for detection of dyslexia using convolutional neural network with EOG signals. Med. Biol. Eng. Comput. 2022, 60, 3041–3055. [Google Scholar] [CrossRef]
  12. He, Q.S.; Yin, G.X.; Vokoun, D.; Shen, Q.; Lu, J.; Liu, X.; Xu, X.; Yu, M.; Dai, Z. Review on improvement, modeling, and application of ionic polymer metal composite artificial muscle. J. Bionic Eng. 2022, 19, 279–298. [Google Scholar] [CrossRef]
  13. O’Neill, M.R.; Sessions, D.; Arora, N.; Chen, V.W.; Juhl, A.; Huff, G.H.; Rudykh, S.; Shepherd, R.F.; Buskohl, P.R. Dielectric Elastomer Architectures with Strain–Tunable Permittivity. Adv. Mater. Technol. 2022, 7, 2200296. [Google Scholar] [CrossRef]
  14. Tomori, H.; Hiyoshi, K.; Kimura, S.; Ishiguri, N.; Iwata, T. A Self-Deformation Robot Design Incorporating Bending-Type Pneumatic Artificial Muscles. Technologies 2019, 7, 51. [Google Scholar] [CrossRef]
  15. Li, T.F.; Li, G.R.; Liang, Y.M.; Cheng, T.Y.; Dai, J.; Yang, X.; Liu, B.Y.; Zeng, Z.D.; Huang, Z.L.; Luo, Y.W.; et al. Fast-moving soft electronic fish. Sci. Adv. 2017, 3, e1602045. [Google Scholar] [CrossRef] [PubMed]
  16. Zhu, J.; Wen, H.; Zhang, H.; Huang, P.L.; Liu, L.; Hu, H.Y. Recent advances in biodegradable electronics- from fundament to the next-generation multi-functional, medical and environmental device. Sustain. Mater. Technol. 2023, 35, e00530. [Google Scholar] [CrossRef]
  17. Nam, S.; Yun, S.; Yoon, J.W.; Park, S.; Park, S.K.; Mun, S.; Park, B.; Kyung, K.U. A robust soft lens for tunable camera application using dielectric elastomer actuators. Soft Robot. 2018, 5, 777–782. [Google Scholar] [CrossRef]
  18. Suo, Z.G. Theory of dielectric elastomers. Acta Mech. Solida Sin. 2010, 23, 549–578. [Google Scholar] [CrossRef]
  19. Carpi, F.; Frediani, G.; Turco, S.; Rossi, D.D. Bioinspired tunable lens with muscle-like electroactive elastomers. Adv. Funct. Mater. 2011, 21, 4152–4158. [Google Scholar] [CrossRef]
  20. Wang, Y.Z.; Li, P.; Gupta, U.; Ouyang, J.; Zhu, J. Tunable soft lens of large focal length change. Soft Robot. 2022, 9, 705–712. [Google Scholar] [CrossRef]
  21. Lau, G.K.; La, T.G.; Shiau, L.L.; Tan, A.W.Y. Challenges of using dielectric elastomer actuators to tune liquid lens. Proc. SPIE-Int. Soc. Opt. Eng. 2014, 9056, 90561J. [Google Scholar]
  22. Zhong, H.; Xue, Q.; Li, J.M.; He, Y.; Xie, Y.; Yang, C. Stretchable transparent polyelectrolyte elastomers for all-Solid tunable lenses of excellent stability based on electro–mechano–optical coupling. Adv. Mater. Technol. 2022, 8, 2200947. [Google Scholar] [CrossRef]
  23. Yin, X.C.; Zhou, P.Y.; Wen, S.; Zhang, J.T. Origami improved dielectric elastomer actuation for tunable lens. IEEE Trans. Instrum. Meas. 2022, 71, 7502709. [Google Scholar] [CrossRef]
  24. Yu, Y.; Huo, H.; Liu, J. Facial expression recognition based on multi-channel fusion and lightweight neural network. Soft Comput.—Fusion Found. Methodol. Appl. 2023, 27, 18549–18563. [Google Scholar] [CrossRef]
  25. Zhang, H.; Xia, Z.J.; Zhang, Z.S.; Zhu, J.X. Miniature and tunable high voltage-driven soft electroactive biconvex lenses for optical visual identification. J. Micromech. Microeng. 2022, 32, 064004. [Google Scholar] [CrossRef]
  26. Wang, Q.; Cao, Y.J.; Wang, Y.N.; Liu, J.; Xie, Y.X. A computational model of bio-inspired tunable lenses. Mech. Based Des. Struct. Mach. 2018, 46, 800–808. [Google Scholar] [CrossRef]
  27. Pieroni, M.; Lagomarsini, C.; De Rossi, D.; Carpi, F. Electrically tunable soft solid lens inspired by reptile and bird accommodation. Bioinspir. Biomim. 2016, 11, 065003. [Google Scholar] [CrossRef]
  28. Zhu, J.; Sun, B.; Xi, M.; Zhan, Y.; Zhang, B.; Zhu, Y. Flexible electronics in humanoid five senses for the era of artificial intelligence of things (AIoT). Mater. Today 2025, 88, 1066–1086. [Google Scholar] [CrossRef]
  29. Zhang, H.; Xia, Z.J.; Zhang, Z.S. Focus-tunable imaging analyses of the liquid lens based on dielectric elastomer actuator. Bull. Mater. Sci. 2021, 44, 148. [Google Scholar] [CrossRef]
  30. Zhang, H.; Zhu, J.X.; Wen, H.Y.; Xia, Z.J.; Zhang, Z.S. Biomimetic human eyes in adaptive lenses with conductive gels. J. Mech. Behav. Biomed. Mater. 2023, 139, 105689. [Google Scholar] [CrossRef]
  31. Zhang, H.Y.; Hong, J.; Zhu, J.; Duan, S.; Xia, M.; Chen, J.; Sun, B.; Xi, M.; Gao, F.; Xiao, Y.; et al. Humanoid electronic-skin technology for the era of artificial intelligence of things (AIoT). Matter 2025, 50, 428–438. [Google Scholar]
Figure 1. The principal diagram of human lens imitation. (a) The schematic of the human right eye. (b) The soft lens based on carbon grease (CG) electrodes. (c) The lens based on transparent hydrogel electrodes. (d) Physical display.
Figure 2. (a) Creating a negative pressure area. (b) The completed lens. (c) The adaptive lens in the initial state. (d) The deformation state at a voltage of 3 kV.
Figure 3. Design for bionic lens control system.
Figure 4. Design of a real-time control circuit for an eye-controlled lens.
Figure 5. One-way eye-movement feature extraction algorithm. (a) Original data waveform. (b) Moving average filtering waveform and limiting filter waveform.
Figure 6. State machine for feature extraction of unidirectional eye movement.
Figure 7. (a) Physical parameters of a liquid lens. (b) Eye-controlled lens experimental setup. (c) The actuating voltage as a function of the focal length.
Figure 8. Real-time control experiments of the interaction lens. (a) (i)–(iii) represent the reference data of eye movement tests, before zooming (voltage Φ = 0), and after zooming (4 kV), respectively. (b) The eyes look to the left while controlling the lens to move to the left. (c) Look to the right while interacting. (d) The control effect of looking upwards. (e) Looking down.
Table 1. Instructions received by the MCU.

Command (Hex) | Function
10 | Query device connection status
20 | Query EOG sensor data
3* | Close specified relays (* takes the value 1, 2, 4, or 8, representing relays A, B, C, and D, respectively)
4* | Disconnect specified relays (* takes the value 1, 2, 4, or 8, representing relays A, B, C, and D, respectively)
Table 2. Data frame format of microcontroller responses.

Field | Length (Byte) | Annotation
Frame header | 1 | 0xEB
Executed instruction | 1 | 0x10, 0x20, 0x3*, or 0x4*
Execution state | 1 | 0x80
Up–down channel data | 2 | ADC value
Left–right channel data | 2 | ADC value
Timestamp | 4 | Milliseconds
Check code | 1 | CRC8 check on the first 11 bytes
* takes the value 1, 2, 4, or 8, representing relays A, B, C, and D, respectively.
Table 3. The start and end times, recognition results, and errors of unidirectional eye movements.

Action | Actual Moment/s | Identified Result/s | Absolute Error/s | Actual Moment/s | Identified Result/s | Absolute Error/s
Look left | 10.87 | 10.00 | 0.87 | 107.91 | 106.60 | 1.31
Stare ahead | 18.62 | 18.29 | 0.33 | 115.33 | 114.89 | 0.44
Look right | 35.89 | 34.80 | 1.09 | 132.33 | 131.34 | 0.99
Stare ahead | 42.55 | 42.33 | 0.22 | 139.03 | 138.71 | 0.32
Look up | 61.75 | 60.32 | 0.43 | 159.21 | 158.77 | 0.44
Stare ahead | 66.75 | 67.63 | 0.88 | 166.21 | 166.21 | 0.00
Look down | 83.56 | 81.93 | 1.63 | 189.81 | 188.17 | 1.64
Stare ahead | 88.79 | 88.90 | 0.11 | 194.18 | 194.94 | 0.76
Table 4. Experimental data on the focal-length variation of the soft interactive lens with voltage.

Voltage (kV) | 0 | 2.1 | 3.4 | 4.6 | 5.7 | 6.5 | 7.5 | 7.8
Focal length (mm) | 112.5 | 107.4 | 101.9 | 94.3 | 85.1 | 73.6 | 56.8 | 49.7