Article

Manufacturing and Control of a Robotic Arm Used in an Educational Mechatronic Platform for Laser Treatments, Followed by Cooling at Low Temperatures

by Cristian-Gabriel Alionte, Edgar Moraru *, Andreea Dana Alionte, Marius-Valentin Gheorghe and Mircea-Iulian Nistor
Mechatronics and Precision Mechanics Department, Faculty of Mechanical Engineering and Mechatronics, University Politehnica of Bucharest, 060042 Bucharest, Romania
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(22), 12157; https://doi.org/10.3390/app152212157
Submission received: 29 October 2025 / Revised: 12 November 2025 / Accepted: 14 November 2025 / Published: 16 November 2025

Abstract

In this paper, we present a mechatronic platform intended for the handling of samples that are thermally processed with laser equipment and subsequently cooled at low temperatures. In addition to the laser and cryogenic equipment, the mechatronic platform includes a robotic arm (with a new modular structure that allows it to adapt to different workplaces) for sample transfer between storage areas, a control system for the robotic arm based on a new haptic device with physical feedback, a laser system, a cryogenic system, and an optical measurement system for thermal processing. A new VR application enables remote control of the robotic arm, ensuring user safety through a haptic device based on the VR model. We also describe the manufacturing of the parts for the robotic arm and the glove using a 3D printing method.

1. Introduction

Heat treatment is an important step in the production of metal parts and components because it improves the mechanical and physical properties of the material. There are two types of heat treatment: laser heat treatment and classical heat treatment. Laser heat treatment is the process of heating the surface of a material using a focused beam of laser energy. The precision of this heat treatment allows for the selective heating of specific regions within an object of interest. Applications that require precise control over the heat energy input, like the aerospace and medical industries, frequently employ laser heat treatment.
Traditional heat treatment methods, on the other hand, entail heating a material to a specific temperature and then cooling it at a regulated rate using conventional heat sources, such as dedicated furnaces. Annealing, quenching, tempering, and normalizing are examples of classical heat treatment procedures. The manufacturing industry widely uses these technologies to enhance the mechanical qualities of materials, including mechanical strength, hardness, wear resistance, or toughness.
Low-temperature treatment is the process of cooling materials to extremely low temperatures, often below −50 °C (in the case of cryogenic treatment, below −150 °C), to improve their physical and mechanical qualities. This process promotes the release of internal stresses in the material and can cause microstructural changes, which contribute to an improvement in material properties.
Heat treatment and low-temperature cooling entail heating and cooling the material in alternating cycles. This technique can have a variety of effects on the material, depending on its composition and qualities. Alternating heating and cooling cycles, for instance, can help produce a material with optimized mechanical properties, greater dimensional stability, and better machinability. Studies [1,2,3,4,5] frequently employ cryogenic treatment in conjunction with heat treatment procedures to enhance the properties of metals and alloys. Following the initial heat treatment, the material may still contain residual stresses, which can reduce its performance and durability. Cryogenic treatment can assist in relieving these stresses and cause favorable modifications to the material’s microstructure. Overall, cryogenic cooling after heat treatment can provide various benefits, the most significant of which are that it
  • Improves mechanical qualities, including hardness and wear resistance.
  • Increases dimensional stability and durability.
  • Improves resistance to corrosion and other chemical phenomena.
  • Enhances fatigue resistance and overall performance.
To summarize, combining heat treatment with cryogenic cooling can significantly improve the overall performance of a material, yielding superior properties that make it suitable for a wide range of critical applications in fields such as aerospace, medicine, or automotive.
Building on these ideas, we argue that university research needs an independent mechatronic platform so that undergraduate, graduate, and doctoral students can safely study how small mechanical parts made of different materials respond when these thermal treatments are applied individually or in combination.
Laser equipment can have a variety of impacts [6]. Since the eye is particularly vulnerable to laser light, we can discuss ocular hazards. Many reported injuries involve the eye and can be caused by direct beams, mirror reflections from other equipment, or beam reflections off the skin. Laser-generated airborne pollutants, such as chemicals (particularly smoke) and material nanoparticles, pose a significant risk as well. Class 4 lasers are known to present fire hazards, which is another area of worry for laser users. Often, laser users face two types of fire hazards: flash fires, which happen when the laser beam encounters flammable material, and electrical fires within the laser, occasionally stemming from faulty wiring.
Human hazards from low temperatures and cryogenic equipment can include skin burns, over-pressurization, asphyxiation, respiratory irritation, fire, and explosion, as well as direct contact with the material surface (skin stuck to cold surfaces, skin burns) or leaks, sprays, or spills of the cryogenic fluids [7,8].
We propose an autonomous stand that can be operated remotely and safely, with the objective of reducing these harmful and undesirable effects, particularly because young students may fail to comply with safety regulations or to pay close attention to the teacher. Figure 1 shows a model of the proposed educational mechatronic testing platform, built around a robotic arm with a gripper whose role is to manipulate the treated specimens between and after the thermal treatments. VR remote control has the following advantages [9,10]: enhanced operator safety, because the user is never in the proximity of a hazardous environment, reducing exposure to physical risks; intuitive and immersive control, because VR models improve the user’s spatial perception and hand–eye coordination, helping operators execute complex tasks more intuitively than with traditional interfaces; flexible training and skill adaptation, since users learn and adapt faster in VR environments, speeding up deployment and improving task outcomes; and real-time environment visualization, which improves awareness. This type of control also has disadvantages. The most important is that the robotic arm must move slowly, because inherent latency and communication delays can induce user discomfort and fatigue, affecting operator performance. Reliability is reduced by signal loss, sensor tracking failures, and hardware complexity, which create challenges for consistent and accurate robot control. Haptic devices usually offer limited tactile feedback [11], reducing performance in manipulation tasks; our proposed glove provides this type of feedback to each finger used for control.
The mechanical part of the robotic system prototype was designed based on the analysis of constructive and technological solutions for different applications [12,13,14,15,16,17,18,19] to obtain an optimal and accessible variant. Of particular importance for the manipulation of specific objects is the prehension part (gripper) [20,21,22,23,24,25,26,27,28,29] of the robot and the related materials, which will be analyzed in future studies. Regarding the laser equipment, and considering the final goal of the concept, equipment with a solid active medium (Nd:YAG, wavelength 1.06 µm) can be used for laser heat treatment or even for micro-welding (precise laser processing) by adjusting the power, pulse, optics, or other processing parameters. Regarding the second heat treatment equipment, a classic installation with a cryogenic atmosphere (liquid nitrogen) can be used in the first phase, and the robotic arm will have the role of manipulating the specimens between and after the thermal processes. Although recent studies outline the beneficial effects of combined thermal treatments (heat treatment and cryogenic) on material properties [1,2,3,4,5], the handling of specimens between treatments is not approached in detail. The novelty of the paper lies in the reconfigurability and rapid adaptability/customization of the robotic arm depending on the geometry of the specimen to be manipulated, and in the concept itself of a mechatronic platform for this purpose, which can be expanded with motorized platforms that bring parts into the working area of the robotic arm (between the equipment), considering that in the first phase the robotic arm, due to the materials used, will not be able to operate directly in aggressive thermal environments. Additive manufacturing is the best option for adaptability, rapid prototyping, and replacement of subassemblies depending on the required object, and it also has the potential to produce components, through specific technologies, even for direct manipulation in aggressive environments. In addition, the benefit of implementing this concept lies in its interdisciplinary educational purpose, spanning branches such as robotics, mechatronics, control engineering, materials science, and manufacturing technologies.
The educational mechatronic platform comprises one robotic positioning system that has two roles. The first is to transport the object from the storage area to the laser thermal processing system and back, and the second is to transport the object from the storage area to the low-temperature thermal processing system and finally to the measurement system. The robotic positioning system is remotely controlled using a glove and a VR environment. The platform uses a thermal camera to measure the part’s temperature and determine the optimal moment to handle the mechanical part. This ensures the gripper is protected and can operate for a longer period.
Therefore, the paper presents the concept of a mechatronic platform for the manipulation of structures during thermal treatments, namely laser thermal treatment followed by cooling at low temperatures. The main subsystems of the platform are presented, with emphasis on the robotic positioning system, the manufacturing of the robotic positioning system and the glove using 3D printing, and the VR software that allows the user to interact with the 3D model and with the robotic positioning system. Important technological aspects of manufacturing the mechanical structure of the prototype are also discussed for further development, testing, and optimization of the system. The methodology of the work is illustrated in Figure 2.

2. Materials and Methods

2.1. Robotic Positioning Systems

The robotic positioning system is the key component of the mechatronic platform. The primary focus is on simplifying the creation process through additive manufacturing, minimizing the number of components, minimizing the amount of material used in production, and implementing a simple control system.
Because the robotic positioning system can have different dimensions and shapes, it is important to be able to command and control the robotic arm without changing the control unit.
In this work, we present a robotic positioning system, which is in fact a reprogrammable multifunctional manipulator designed to handle materials, components, tools, or specialized devices through programmed movements. This enables it to perform a variety of tasks at a high performance level and to gather information from the work environment to facilitate movement. It features three simple articulated arm rotations, resembling articulated toroidal, revolute, or anthropomorphic robot arms.
The robotic positioning system has 4 degrees of freedom: one rotation at the fixed base, two rotations of the arm segments, and one displacement of the gripper fingers—Figure 3.
The fixed base contains the stepper motor, which is used for the frame’s rotational movement, as well as components required for this movement, such as the gear wheel, which is positioned on the axis of the stepper motor, and the ring gear, which is fastened on the inner side surface of the frame and is engaged by the gear wheel mounted on the shaft of the stepper motor, in support. It also contains a central shaft, on which is placed a radial bearing that allows the frame to rotate and is housed in a bearing holder. Additionally, the fixed base incorporates two raceways for bearing balls, each secured in a ring; this assembly serves as an axial bearing, supporting the arm’s weight. Between the upper section of the frame and the bottom half, there is an axial bearing supporting the weight of the robotic positioning system.
The robotic arm, in the first configuration used in the testing, has three rotations, and thus the kinematic scheme and the control scheme are similar. According to Figure 4, element 1 is a column, element 2 is an arm, and element 3 is a forearm. The rotations occur in the rotational kinematic couplings A0 (0, 1), A (1, 2), and B (2, 3), and the whole mechanism has 3 degrees of freedom. This configuration is not restrictive, and the robot can have multiple elements structured in a chain from the base to the gripper.
In order to control the robot, we need to know the position of the gripper in space; thus, we must apply the inverse kinematics of the robotic arm mechanism. This involves knowing the instantaneous position of the tracer point P on an imposed trajectory and determining the position functions of the active elements. The position equations of point P are defined in a Cartesian coordinate system, according to the notations in Figure 4b:
$$\left[d_1 + d_2\cos\varphi_2 + d_3\cos(\varphi_2+\varphi_3)\right]\cos\varphi_1 - (a_2 - a_3)\sin\varphi_1 = x_p$$
$$\left[d_1 + d_2\cos\varphi_2 + d_3\cos(\varphi_2+\varphi_3)\right]\sin\varphi_1 + (a_2 - a_3)\cos\varphi_1 = y_p$$
$$d_2\sin\varphi_2 + d_3\sin(\varphi_2+\varphi_3) = z_p - a_1 \tag{1}$$
To solve the system of equations more easily, we can consider the particular case $a_2 = a_3$; then, using the first two equations of system (1), angle $\varphi_1$ is obtained from
$$\varphi_1 = \arctan\!\left(\frac{y_p}{x_p}\right) \tag{2}$$
The system of Equation (1) can be simplified by using Equation (2):
$$d_2\cos\varphi_2 + d_3\cos(\varphi_2+\varphi_3) = \sqrt{x_p^2+y_p^2} - d_1$$
$$d_2\sin\varphi_2 + d_3\sin(\varphi_2+\varphi_3) = z_p - a_1 \tag{3}$$
From Equation (3), the angle $(\varphi_2+\varphi_3)$ can be eliminated, and the value of $\varphi_2$ is obtained:
$$\varphi_2 = 2\arctan\!\left(\frac{A \pm \sqrt{A^2+B^2-C^2}}{B+C}\right) \tag{4}$$
The coefficients A, B, and C used in Equation (4) are defined by the equations
$$A = 2d_2\left(z_p - a_1\right)$$
$$B = 2d_2\left(\sqrt{x_p^2+y_p^2} - d_1\right)$$
$$C = \left(\sqrt{x_p^2+y_p^2} - d_1\right)^2 + \left(z_p - a_1\right)^2 + d_2^2 - d_3^2 \tag{5}$$
Knowing angles $\varphi_1$ and $\varphi_2$, the angle $\varphi_3$ is obtained using the equation
$$\varphi_3 = \arcsin\!\left(\frac{z_p - a_1 - d_2\sin\varphi_2}{d_3}\right) - \varphi_2 \tag{6}$$
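As a minimal sketch, the closed-form solution can be implemented as follows; the function and parameter names are illustrative, the simplified case $a_2 = a_3$ from Equation (2) is assumed, and both signs of the square root in Equation (4) are returned as candidate (elbow-up/elbow-down) solutions:

```python
import math

def inverse_kinematics(xp, yp, zp, d1, d2, d3, a1):
    """Closed-form inverse kinematics of the 3R arm (special case a2 = a3).

    Returns a list of (phi1, phi2, phi3) candidate solutions.
    """
    phi1 = math.atan2(yp, xp)            # Eq. (2)
    r = math.hypot(xp, yp) - d1          # radial reach in the arm plane
    h = zp - a1                          # height above the shoulder joint
    # Coefficients of the tangent half-angle equation A*sin(phi2) + B*cos(phi2) = C
    A = 2.0 * d2 * h
    B = 2.0 * d2 * r
    C = r * r + h * h + d2 * d2 - d3 * d3
    disc = A * A + B * B - C * C
    if disc < 0.0:
        return []                        # target outside the workspace
    sols = []
    for sign in (+1.0, -1.0):            # the +/- branches of Eq. (4)
        phi2 = 2.0 * math.atan2(A + sign * math.sqrt(disc), B + C)
        s = (h - d2 * math.sin(phi2)) / d3
        if abs(s) <= 1.0:
            phi3 = math.asin(s) - phi2   # from the arcsine relation above
            sols.append((phi1, phi2, phi3))
    return sols

def forward_kinematics(phi1, phi2, phi3, d1, d2, d3, a1):
    """Position equations (1) with a2 = a3 (the lateral offsets cancel)."""
    reach = d1 + d2 * math.cos(phi2) + d3 * math.cos(phi2 + phi3)
    return (reach * math.cos(phi1),
            reach * math.sin(phi1),
            a1 + d2 * math.sin(phi2) + d3 * math.sin(phi2 + phi3))
```

A forward-kinematics round trip is a convenient self-check: computing the joint angles for a reachable target and substituting them back into Equation (1) should reproduce the target position.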
Because a robotic positioning system that can have different dimensions and shapes is used, the problem of inverse kinematics accuracy is a very complex issue. Inverse kinematics is the process of computing joint angles so that the robotic arm’s gripper is at a specific location and orientation. Its accuracy shows the error between the desired and the obtained position and orientation after moving the joints according to the inverse kinematic solver. From a single-dimension perspective, it can be said that the accuracy is
$$\varepsilon = x_{\mathrm{desired}} - x_{\mathrm{obtained}}$$
where ε is the accuracy, xdesired is the position that the robotic arm’s gripper should reach, and xobtained is the measured position after the arm has moved its gripper. It should be noted that this is an oversimplified example, because the position in both VR and the real world requires three Cartesian coordinates and three angles to completely define the position and orientation of the end-effector.
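As an illustration (the names are hypothetical), the one-dimensional accuracy above generalizes to a Euclidean position error, which can then be checked against a positioning budget:

```python
import math

def position_error(p_desired, p_obtained):
    """Euclidean distance between the commanded and the measured gripper
    positions: the 3D analogue of the scalar accuracy defined above."""
    return math.dist(p_desired, p_obtained)

def within_tolerance(p_desired, p_obtained, tol_m=0.004):
    """Check the error against a positioning budget (4 mm here, matching
    the requirement stated later in this section)."""
    return position_error(p_desired, p_obtained) <= tol_m
```

The orientation error would be handled analogously, e.g., as the angle of the relative rotation between the desired and obtained end-effector frames.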
Various methods available in the literature can be used to assess the inverse kinematics accuracy, such as the one described by Zhang [30]. Other methods could include a direct measuring approach or image processing from two cameras, which could be developed into a closed feedback loop.
There are several factors that influence inverse kinematics accuracy. Among them, one can enumerate model accuracy, numerical solving precision, algorithm, joint resolution (the physical precision of actuators or encoders), signal noise, sensor drift, mechanical tolerances, backlash, and compliance in gears or links, and thermal expansion. The material used to manufacture the mechanical components of the presented model was chosen for its low cost and ease of prototyping. Therefore, their mechanical properties and stability vary.
Being an educational system with a variable number of modules, and due to the materials used, calculating accuracy is a task too difficult to present in this article, which focuses on the structure and control of the system. However, its determination can be used as an exercise for students by giving a higher or lower degree of complexity depending on their training and the objective of the application.
The next step after validating the control software of the robotic arm, the hardware structure of the VR glove, and the correlation between the VR and reality will be to manufacture robotic modules made from reliable materials that will greatly increase the inverse kinematics accuracy.
However, the robotic arm is subject to constraints related to the position and mechanical configuration of the cryogenic system and the workpiece positioning system. The workpieces are approximately parallelepiped-shaped, with a maximum weight of 200 g and the following maximum dimensions: length 80 mm, width 55 mm, and height 6 mm. The distance between the arm’s axis of rotation and the handling point of the cryogenic system is 300 mm, and the distance to the sample positioning system of the laser processing system is 350 mm. The robotic arm needs to rotate by 198 degrees, and the height is not dimensionally constrained. The orientation and position correction of the machined part is done at the gripper level, which will be presented in another article. For these reasons, the robotic arm in its minimum configuration (the one presented in this article) must move between the two systems with a positioning error of at most 4 mm. Manufacturing imposes no constraints on speed and repeatability, which are parameters that depend on user experience.

2.2. System for Haptic Control of the Robotic Arm

The gloves employed in virtual reality are interactive devices that enhance the realism and immersion of VR experiences. They accomplish this by incorporating haptic technologies that enhance user interaction with elements of the virtual environment. These gloves enable the visual description of hand movements and gestures in a virtual environment, transmit a range of tactile sensations from fine to coarse, simulate the weight and solidity of virtual objects, allow the perception of object shape and texture on virtual surfaces, and replicate the thermal sensation of objects within the virtual domain. VR gloves seek to emulate human interaction in the physical realm with utmost precision, converting it into a digital format. The primary function of VR gloves is twofold: to capture movement by transmitting location, orientation, and hand motions in real time, and to deliver sensory input using integrated actuators.
Alternative manual input methods include video cameras that track finger motions without the need for additional controllers, as well as portable haptic devices that offer accurate tactile feedback. As an example of an existing product, the built-in hand tracking of Meta Quest 2 (Meta Platforms, Inc., Menlo Park, CA, USA) uses the VR headset’s cameras to monitor hand movements, enabling interaction without controllers. After analyzing current market solutions and their attributes, we decided to develop a virtual reality glove capable of detecting hand orientation and finger positioning and delivering force feedback. The proposed design targets an optimal balance of cost, complexity, and usefulness, aimed at instructional, simulation, and intuitive-interface applications in virtual worlds.
The assembly’s 3D modeling was performed in SolidWorks (version: 2023), providing a clear visualization of the arrangement of system components and confirming their geometric compatibility. An assembly was created for each digit. It consists of the support attached to the hand and the subassembly that restricts finger movement. The finger movement tracking and restricting subassembly consists of a bracket that supports the additional components and can be attached to the base bracket via a sliding mechanism; an SG 90 servo motor (Tower Pro Pte Ltd., Singapore) that controls finger movement; a resistive spinning sensor for detecting finger position; a drum onto which the wires connected to the fingers are coiled; a supporting component that connects the resistive rotary sensor, the internal spiral spring, the drum, and a casing that secures to the purple element; and a spiral spring.
The hand support affixed to the glove is illustrated in Figure 5 and possesses the following attributes: the device features a slender base that allows for folding and conforms to the hand’s shape; it comprises six platforms housing five subassemblies, an MPU6050 sensor (TDK InvenSense, San Jose, CA, USA), and a microcontroller serving as the central control system. At either end of the structure, two elongated bores facilitate the attachment of an elastic band to secure it to the glove, along with a bracket that connects the base and the components that monitor and restrict finger movement. The bracket serves to support and secure the electrical and mechanical components. The assembly comprises a platform for the servo motor, a substantial bore for the insertion of the resistive rotating sensor’s base, and a sliding mechanism for its attachment to the base support.
The finger movement tracking and limiting subassembly comprises a finger position measurement subassembly that utilizes a retractable key fob. The threads attached to the fingers are coiled on a drum that is affixed directly to the shaft of the resistive rotating sensor. When the finger exerts tension on the wire, the drum rotates in conjunction with the resistive spinning sensor shaft. This subassembly includes a component that links the resistive rotating sensor to the spiral arc. It possesses an interior hexagonal configuration that enables the resistive spinning sensor to be secured by a nut. Furthermore, this component features a bore and a cut in the cylindrical contour designed to accommodate and direct the spiral spring. This spring features dual recesses, with one end connected to the component depicted in the image above and the other end linked to the resistive rotating sensor shaft. This subassembly functions to convert finger movement into the rotation of the resistive rotating sensor, enabling the wires connected to the fingers to extend and retract in accordance with their motion. The spiral spring is tensioned and energized during the flexion of the finger, functioning as a resilient spring. Upon the completion of finger extension, the spring functions as the motor, retracting the wires and returning the resistive rotating sensor to the zero position.
The system includes a finger movement limiting subassembly, including a drum connected to the resistive rotating sensor shaft, a fastening component, a servo motor, and a lever affixed to the servo motor shaft. The lever is aligned at the same angle as the screw connected to the drum and operates via the servo motor alongside the drum. Upon receiving a signal, the microprocessor halts the servo motor shaft, thereby restricting the drum’s rotation.
The primary function of the glove is to establish a seamless interaction between the user and the virtual environment. The objectives include accurately determining the hand’s spatial orientation (yaw, pitch, and roll angles), estimating the relative position of each finger (degree of flexion/closure), implementing a force feedback system to simulate object grasping, and facilitating real-time data transmission to a computer that interprets movements and controls the virtual avatar. A wireless interface is employed, allowing unrestricted user mobility.
The glove comprises two primary components: a mobile module affixed to the user’s hand, housing all sensors and actuators, and a stationary reference module that receives data over WiFi and delivers it to the PC through the serial connection for real-time analysis. Figure 6 illustrates the principle of the entire assembly, which emphasizes the primary components and their interconnections. The black arrows denote the power supply, the red arrows indicate the data received from the sensors and communicated between the ESPs and the computer, and the green arrows represent the servo motor positioning commands received from the computer and transmitted to the ESP32 via the ESP8266 (Espressif Systems, Shanghai, China).
Data communication between the mobile and stationary modules occurs through WiFi, utilizing the UDP protocol for minimal latency. The glove-mounted ESP32 continuously transmits data packets, and the stationary ESP8266 issues control commands to the actuators. The ESP8266 interfaces with the computer over USB serial, providing sensor data and receiving control commands for the actuators. The mobile module needs an independent power source for wireless operation. The system operates on a 5 V, 2.1 A Li-Po battery with a capacity of 10,000 mAh, which supplies power to the ESP32 and, subsequently, the resistive rotating sensors. The servo motors are powered independently from the same battery; this prevents short-circuiting the microcontroller. An MPU6050 (TDK InvenSense, San Jose, CA, USA), a 6-axis accelerometer and gyroscope module, was selected to determine the hand’s orientation and is affixed to the glove. Its data is acquired, and computations are performed to establish the spatial orientation of the hand. The device was chosen for its capability to simultaneously measure linear acceleration and angular velocity on a single chip, its superior cost-performance ratio, its suitability for educational applications and prototypes, and its widespread availability in specialized retail and online platforms, enabling prompt acquisition. It features multiple libraries and usage examples for platforms such as Arduino, ESP32, and Raspberry Pi, and it provides configurable accelerometer ranges (±2 g to ±16 g) and gyroscope ranges (±250°/s to ±2000°/s) tailored to specific applications, with a compact size that eases integration into portable systems or those with spatial constraints. It is advisable in battery-operated devices owing to its minimal power consumption, and it enables internal processing of motion data, alleviating the burden on the host microcontroller. Finally, it offers a reliable communication interface that is straightforward to use in various embedded architectures and provides satisfactory performance for applications with moderate demands at minimal cost.
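The glove-to-base exchange described above can be sketched on the host side. The 32-byte packet layout below is purely illustrative (the paper does not specify the datagram format): it carries the three MPU6050 orientation angles plus five finger-flexion values that would travel in each UDP datagram between the ESP32 and the ESP8266:

```python
import struct

# Hypothetical datagram layout (not specified in the paper): three float32
# values for yaw/pitch/roll plus five float32 finger-flexion values in
# [0, 1], little-endian, 32 bytes total.
PACKET_FMT = "<3f5f"

def pack_glove_state(yaw, pitch, roll, fingers):
    """Serialize one glove state into the payload of a UDP datagram."""
    return struct.pack(PACKET_FMT, yaw, pitch, roll, *fingers)

def unpack_glove_state(payload):
    """Deserialize a received payload back into orientation and flexion."""
    vals = struct.unpack(PACKET_FMT, payload)
    return vals[0], vals[1], vals[2], list(vals[3:])
```

A fixed-size binary layout like this keeps each datagram small and parsing trivial, which suits the low-latency, loss-tolerant UDP transport chosen for the glove.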
Rotary resistive sensors were affixed to each finger to monitor their positions. They furnish data regarding the flexion angle, facilitating the reconstruction of the hand’s position in virtual space. Altering the angle of the shaft’s rotation modifies both the resistance and the voltage detected by the microcontroller. The sensor possesses the following attributes: its standard resistance value is compatible with most microcontrollers for analog reading; it provides an output voltage proportional to the spindle rotation, suitable for analog control; and it is applicable in cost-effective applications.
It is readily available in both online and physical retail outlets; as a standard component, it boasts a lifespan of tens of thousands of revolutions, adequate for most embedded applications; it facilitates straightforward integration into assemblies and compact electronics; the linear relationship between angle and position voltage streamlines software signal processing; it is compatible with analog readings on Arduino/ESP32 without necessitating additional signal conditioning; and it requires no specialized programming or configuration, merely connection to analog pins.
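A minimal sketch of the finger-position readout follows, assuming a 12-bit ADC (as on the ESP32) and a 270° electrical travel for the potentiometer; both constants, and the open/closed calibration angles, are illustrative assumptions rather than values from the paper:

```python
ADC_MAX = 4095           # full scale of a 12-bit ADC (assumed, as on the ESP32)
SENSOR_SPAN_DEG = 270.0  # typical electrical travel of a rotary potentiometer (assumed)

def adc_to_angle(raw):
    """Map a raw analog reading to shaft angle in degrees, relying on the
    sensor's linear angle-to-voltage relationship described above."""
    return (raw / ADC_MAX) * SENSOR_SPAN_DEG

def angle_to_flexion(angle_deg, angle_open=0.0, angle_closed=120.0):
    """Normalize drum angle to a 0..1 finger-flexion value; the open/closed
    calibration angles are illustrative placeholders."""
    t = (angle_deg - angle_open) / (angle_closed - angle_open)
    return min(1.0, max(0.0, t))     # clamp to the valid flexion range
```

In practice the open/closed angles would be calibrated per finger, e.g., by asking the user to fully open and then fully close the hand.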
Force feedback is accomplished with SG90 servo motors, which can limit finger movement to replicate the sensation of contacting an object. Their regulation is executed by a PWM signal produced according to the data obtained from the virtual world (from Unreal Engine). The motor possesses the following attributes: compatible with standard power supplies utilized in embedded projects, such as Arduino or Li-ion batteries, it is sufficient for lightweight applications, including the movement of robotic hand fingers or other lightweight mechanisms. It offers rapid response, making it suitable for applications requiring interactive or swift feedback, and facilitates a wide range of motion, advantageous for position control across various applications. Its compact and lightweight design is ideal for portable systems or those mounted on delicate structures, allowing for integration into small enclosures or confined spaces. It can be powered directly from a microcontroller or an external source without necessitating complex power circuits and is easily controllable via software with dedicated features on platforms such as Arduino or ESP32, making it cost-effective for prototypes and educational projects.
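On the actuation side, the angle-to-PWM mapping for an SG90-class servo can be sketched as follows; the 0.5–2.5 ms pulse range over a 20 ms (50 Hz) frame is a typical figure and should be checked against the actual unit’s datasheet:

```python
# SG90-style servos expect a 50 Hz PWM signal whose pulse width encodes the
# commanded angle: ~0.5 ms -> 0 deg, ~2.5 ms -> 180 deg (typical values).
PERIOD_US = 20000.0
MIN_PULSE_US = 500.0
MAX_PULSE_US = 2500.0

def servo_pulse_us(angle_deg):
    """Pulse width in microseconds for a commanded lever angle in [0, 180]."""
    angle = min(180.0, max(0.0, angle_deg))   # clamp to the mechanical range
    return MIN_PULSE_US + (angle / 180.0) * (MAX_PULSE_US - MIN_PULSE_US)

def servo_duty_cycle(angle_deg):
    """Fractional duty cycle of the 50 Hz PWM frame for that angle."""
    return servo_pulse_us(angle_deg) / PERIOD_US
```

The microcontroller would regenerate this PWM signal from the angle commands received from the virtual environment, locking the lever (and thus the drum) at the angle where the virtual object is contacted.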
The proposed project entails the development of an interactive intelligent glove system that can detect finger and wrist movements, communicate data to a computer, and respond to orders by actuating servo motors. Specific hardware components were selected to execute these functions, with their operation grounded in calculations and engineering principles. The concept employs the MPU6050 sensor, a MEMS device that combines an accelerometer and a triaxial gyroscope, to accurately measure hand orientation. This sensor delivers critical information for assessing motion and spatial positioning, and is extensively utilized in robotics, virtual reality, and wearable technology applications. The MPU6050 is an inertial sensor consisting of two primary components: a triaxial accelerometer that quantifies linear acceleration across three axes, and a triaxial gyroscope that assesses angular velocity across the x, y, and z axes. MEMS sensors display inaccuracies known as biases or offsets, which must be rectified to achieve accurate measurements. The calibration occurs when stationary, by documenting and computing the average values of the measurements.
The average accelerometer bias values bax, bay, and baz are computed as follows:
$$b_{ax} = \frac{1}{N}\sum_{i=1}^{N} a_{x,i}, \qquad b_{ay} = \frac{1}{N}\sum_{i=1}^{N} a_{y,i}, \qquad b_{az} = \frac{1}{N}\sum_{i=1}^{N} a_{z,i} - g$$
where g = 9.81 m/s2 represents the gravitational acceleration (subtracted on the z-axis, which measures gravity when the sensor is stationary) and N denotes the number of measurements. The gyroscope biases bωx, bωy, and bωz are computed in the same manner:
$$b_{\omega x} = \frac{1}{N}\sum_{i=1}^{N} \omega_{x,i}, \qquad b_{\omega y} = \frac{1}{N}\sum_{i=1}^{N} \omega_{y,i}, \qquad b_{\omega z} = \frac{1}{N}\sum_{i=1}^{N} \omega_{z,i}$$
The biases are subsequently subtracted from the raw readings to remove the constant errors introduced by the sensor. For each acquisition cycle, the measured values are adjusted using the equations
$$a_{xc} = a_x - b_{ax}, \qquad a_{yc} = a_y - b_{ay}, \qquad a_{zc} = a_z - b_{az}$$
$$\omega_{xc} = \omega_x - b_{\omega x}, \qquad \omega_{yc} = \omega_y - b_{\omega y}, \qquad \omega_{zc} = \omega_z - b_{\omega z}$$
where the subscript c signifies bias-corrected values. The hand's orientation is characterized by three rotation angles: pitch (θ), the forward-backward tilt (rotation about the y-axis); roll (φ), the left-right lateral tilt (rotation about the x-axis); and yaw (ψ), the rotation about the vertical z-axis. The measured acceleration includes the gravitational component and can therefore be used to determine the sensor's tilt relative to the vertical. The accelerometer-based pitch (θacc) and roll (φacc) angles are computed as follows:
$$\theta_{acc} = \arctan\left(\frac{a_{yc}}{\sqrt{a_{xc}^{2} + a_{zc}^{2}}}\right)\cdot\frac{180}{\pi}, \qquad \varphi_{acc} = \arctan\left(\frac{a_{xc}}{\sqrt{a_{yc}^{2} + a_{zc}^{2}}}\right)\cdot\frac{180}{\pi}$$
where arctan denotes the arc tangent function and the result is expressed in degrees. These values offer a direct estimate of orientation; however, they are affected by noise and by transient accelerations caused by motion. The gyroscope measures angular velocity, which must be integrated over time to obtain rotation angles.
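As an illustration, the accelerometer tilt equations above can be sketched in standard C++ (the axis convention follows the formulas; the function and type names are ours, not part of the project firmware):

```cpp
#include <cmath>

// Tilt angles (degrees) from bias-corrected accelerometer readings,
// following the equations above: pitch uses a_yc against the x-z plane,
// roll uses a_xc against the y-z plane.
struct TiltAngles { double pitch_deg; double roll_deg; };

TiltAngles tiltFromAccel(double axc, double ayc, double azc) {
    const double rad2deg = 180.0 / 3.14159265358979323846;
    TiltAngles t;
    t.pitch_deg = std::atan(ayc / std::sqrt(axc * axc + azc * azc)) * rad2deg;
    t.roll_deg  = std::atan(axc / std::sqrt(ayc * ayc + azc * azc)) * rad2deg;
    return t;
}
```

With the sensor lying flat (only gravity on the z-axis), both angles are zero; equal y and z components yield a 45-degree pitch.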
The gyroscope-based angles (pitch θgiro, roll φgiro, and yaw ψgiro) are obtained by integrating the angular velocities using the following equations:
$$\theta_{giro}(t) = \theta(t-\Delta t) + \omega_{xc}\cdot\Delta t, \qquad \varphi_{giro}(t) = \varphi(t-\Delta t) + \omega_{yc}\cdot\Delta t, \qquad \psi_{giro}(t) = \psi(t-\Delta t) + \omega_{zc}\cdot\Delta t$$
These values respond quickly to changes in orientation but accumulate drift over time, which can lead to significant errors. Consequently, a complementary filter is applied using the following equations:
$$\theta(t) = \alpha\left[\theta(t-\Delta t) + \omega_{xc}\cdot\Delta t\right] + (1-\alpha)\,\theta_{acc}$$
$$\varphi(t) = \alpha\left[\varphi(t-\Delta t) + \omega_{yc}\cdot\Delta t\right] + (1-\alpha)\,\varphi_{acc}$$
$$\psi(t) = \psi(t-\Delta t) + \omega_{zc}\cdot\Delta t$$
where the coefficient α = 0.96 balances the gyroscope and accelerometer data. A value approaching 1 favors the gyroscope for fast response, while the complementary weight (1 − α) uses the accelerometer data to reduce drift. The yaw angle has no accelerometer reference and is therefore obtained by integration alone.
We conducted an experiment to assess the difference between filtered and unfiltered readings while the sensor was kept stationary. A discrepancy of around 2 degrees was observed, with the calibrated data being more stable than the uncalibrated sensor data. In this case, the roll angle shows a stability difference and a deviation of 1.5 degrees from the actual value. The plot shows that, although signal clarity is unchanged, the unfiltered value rises steadily as errors accumulate over time, whereas the filtered value remains stable. The examined mechanical system comprises a servo motor (SG90) that actuates a rigid lever, a drum around which a traction wire connected to a finger is coiled, and a limit screw mounted eccentrically on the drum that engages the lever, restricting the drum's rotation in one direction.
The objective of these calculations and models is to verify the system's operation and ensure compliance with the technical and functional requirements of the design theme. This method is implemented in Unreal Engine and condensed into a function termed "Make Rotator," illustrated in Figure 7.
The finger movement starts from a fully open position, regarded as the reference position at the initial time t0. During operation, the system processes, in real time, the data from the resistive rotating sensors positioned on each finger. The acquired analog values are inserted into the aforementioned equations to determine the closing angles of each finger's joints. The position of each finger segment is thus continuously updated according to these values, ensuring smooth and synchronized motion of the entire prosthetic hand.
The left image in Figure 8 depicts the system's state at time t0, with all fingers in the open position. The right image of Figure 8 shows the hand's final configuration at time t, with the fingers fully closed, obtained by processing the inputs from the resistive rotating sensors and applying the kinematic equations for each joint.

2.3. Mechanical Structure Design of the Controlling Device

The mechanical architecture of the VR glove system is a critical element of the entire design, directly affecting motion tracking precision, user comfort, and long-term durability. The selection and incorporation of mechanical and electronic components prioritized modularity, ergonomics, and lightweight design to facilitate a seamless experience in virtual reality. The glove comprises five modules, each corresponding to a finger, all incorporated into a central unit located on the rear of the palm. Each module includes a linear resistive rotary sensor physically affixed to monitor the flexion and extension of the corresponding finger. The motion of each finger is regulated by an SG90 servo motor, operated by a tensioned wire mechanism, enabling the fingers to open and close in response to signals from the control system.
The support for these components has a geometry designed to accommodate them. For the servo motor to be fully integrated, the defined specifications must be respected; the dimensions designated B and D are critical for a correct and secure installation. Likewise, the dimensions of the resistive rotating sensor are shown in Figure 9: the diameter of its body and the length of the section carrying the connection pins must be taken into account for proper installation. These dimensions are crucial for mechanical stability and for access to the electrical connections.
A crucial element in the proper assembly of the subassembly on the VR glove is the sliding mechanism, which ensures the secure movement and fixation of the components. This mechanism guarantees a consistent, rapid, and precise installation of the modules onto the bracket affixed to the hand. The dimensions of the sliding mechanism are essential for the system's proper operation: the 18.55 mm width of the guiding element and its 1.55 mm height must be strictly maintained to ensure the parameters mentioned above.
The sliding mechanism serves as a mechanical guide, facilitating the insertion and secure fixation of the module in a precise location, thereby preventing positioning errors that could compromise both the physical movement of the fingers and the data interpretation from the sensors. This technique is optimal for wearable devices, where spatial constraints exist and installation durability is crucial for safety and comfort.
The command-and-control system of the proposed VR glove is built on a distributed architecture in which proper functionality relies on the interaction of two microcontrollers: an ESP32 and an ESP8266. The ESP32 module, mounted on the user's glove, primarily gathers data from the sensors and drives the actuators. The ESP8266 module is connected to a computer and handles communication with the virtual environment developed in Unreal Engine 5.
Finger position data is collected by five resistive rotating sensors, arranged to accurately capture the movement of each finger. The analog readings acquired by the ESP32 microcontroller are processed and correlated with the actual bending angles of the joints. The hand's spatial orientation is determined concurrently using an MPU6050 inertial sensor, which combines a triaxial accelerometer and gyroscope. This orientation is crucial for accurately depicting the hand's position in virtual space.
After local data processing, the ESP32 microcontroller relays the information to the ESP8266 module through a WiFi connection, employing a UDP communication protocol. The ESP8266 acquires this data and relays it to the PC using the USB serial port. The program created in Unreal Engine 5 processes this data in real time and produces an accurate representation of the user’s hand movements within the virtual environment.
The system is bidirectional: real movements are transferred into the virtual environment, and commands are returned from the computer to the physical glove. The computer transmits positioning commands for the five finger-mounted servo motors via the ESP8266 and then the ESP32, which generates the PWM signals that drive them according to the data received from the VR application.
The data flow between components is constant and coordinated, ensuring minimal latency and a seamless user experience. The system’s control and command rely on a cohesive integration of hardware components (sensors, microcontrollers, actuators) and the software environment, enabling the translation of real movements into an interactive virtual environment.
The system’s fundamental element is the ESP32-WROOM-32D microcontroller (Espressif Systems, Shanghai, China), a multifunctional and high-performance module including a dual-core Xtensa LX6 CPU (Espressif Systems, Shanghai, China) operating at a frequency of up to 240 MHz. The platform provides a substantial quantity of input/output (I/O) pins, with 38 accessible on the version utilized in this project. The ESP32 features comprehensive support for digital communications (I2C, SPI, UART) and incorporates a 12-bit resolution analog-to-digital converter (ADC) [24].
The ESP32 within the system gathers and interprets sensor data while simultaneously controlling the servo motors’ positions in real time, according to incoming commands or locally generated data. This module can connect efficiently with other microcontrollers or mobile devices due to its native WiFi capability and strong processing capacity, eliminating the need for extra network components.
The NodeMCU ESP8266 module serves as an interface between the ESP32 and the computer executing the program created in Unreal Engine 5. In contrast to the ESP32, the ESP8266 is a more basic, single-core microcontroller operating at 80 MHz (or 160 MHz in turbo mode); however, it provides reliable WiFi connectivity and an adequate number of I/O pins for bidirectional data transmission to and from the PC via the USB serial port. The available number of pins is quite limited, around 11 functional digital pins, although this is adequate for the intermediary role it fulfills within the system.
The integration of ESP32 with ESP8266 modules constitutes a cost-effective, low-complexity, and performance-efficient solution.
The electronics utilize the ESP32-WROOM module, incorporating a sophisticated data acquisition and driving system that integrates five resistive rotary sensors, five SG90 servo motors, and an MPU6050 module.
The connections were established to facilitate concurrent and hardware-independent operation among the modules linked to the microcontroller, utilizing the designated pins for each function: analog (ADC), PWM signal, and I2C communication.
Resistive rotating sensors function as position sensors to detect finger movement, generating analog signals proportional to the rotation of their spindle. The signals are connected to the analog inputs of the ESP32, specifically to the GPIO32, GPIO33, GPIO34, GPIO35, and GPIO39 pins. These pins were selected because they belong to the ADC1 block, which is favored for analog signal acquisition as it remains functional and stable during WiFi activity.
GPIO34, GPIO35, and GPIO39 are pins designated just for input, rendering them optimal for capturing analog signals from resistive rotating sensors, as no digital output is required. Connecting the resistive rotating sensors requires providing them with 3.3 V, linking the common ground (GND) to the system’s reference, and connecting the center pin to the specified analog inputs.
The control of the five SG90 servo motors utilized pins GPIO21, GPIO5, GPIO17, GPIO4, and GPIO27. These ports may provide PWM signals, essential for regulating the position of the servo motors. The ESP32 facilitates the generation of precise PWM signals over numerous pins, selected to avoid interference with the board’s essential boot or communication operations. The control signal for each servo motor is transmitted to the respective pin of the ESP32.
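For illustration, the computation of a servo PWM duty value could be sketched as follows, assuming a 50 Hz signal, 16-bit LEDC resolution, and a 0.5-2.4 ms pulse range typical for SG90-class servos (these parameters are our assumptions, not values from the project, and real units may need per-servo calibration):

```cpp
#include <cstdint>

// Convert a servo angle (0-180 deg) to a PWM duty value.
// Assumptions: 50 Hz period (20 ms), 16-bit duty resolution (0-65535),
// pulse range 0.5-2.4 ms. The angle is clamped to the valid range.
uint32_t servoAngleToDuty(double angle_deg,
                          double min_pulse_ms = 0.5,
                          double max_pulse_ms = 2.4,
                          double period_ms = 20.0,
                          uint32_t duty_max = 65535) {
    if (angle_deg < 0.0) angle_deg = 0.0;
    if (angle_deg > 180.0) angle_deg = 180.0;
    double pulse_ms =
        min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180.0;
    return static_cast<uint32_t>(pulse_ms / period_ms * duty_max + 0.5);
}
```

On the ESP32, a value computed this way would typically be written to an LEDC channel configured for the chosen pin; the helper itself is hardware-independent.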
The MPU6050 module, a 6-axis device combining a triaxial accelerometer and a triaxial gyroscope, uses the I2C protocol, implemented on the ESP32's GPIO18 (SDA) and GPIO19 (SCL) pins. Most libraries used for the MPU6050 allow the I2C pins to be remapped in software. The module operates at 3.3 V, with a ground shared with the rest of the circuit, ensuring logic compatibility and proper operation of the I2C bus.
Consequently, the allocation of the pins in this configuration was executed by taking into account the functionalities of each pin on the ESP32 and the particular specifications of the linked components. The objective was to utilize the ADC1 block solely for acquiring data from resistive rotating sensors, to designate PWM pins that do not play a crucial part in the boot process for servo control, and to adhere to conventional I2C protocols for communication with the MPU6050. This produces a cohesive, resilient architecture that aligns with the demands of interactive or robotic applications utilizing this arrangement.
The power supply scheme is designed for dependable operation and avoids overloading the internal regulator of the ESP32 board. The three primary component groups (ESP32, resistive rotating sensors, and servo motors) have distinct power supply requirements yet are interconnected through a shared ground point (GND), essential for the system's proper functioning.
The ESP32-WROOM board receives power through its input connector (via USB) and, when mounted on the breadboard, directly provides the 3.3 V necessary to power the resistive rotating sensors and the MPU6050 module. The voltage is sourced from the 3.3 V pin of the ESP32 board, which is linked to the positive rail of the upper section of the breadboard. The resistive rotating sensors operate passively and draw minimal current, allowing the ESP32 to supply all five sensors directly without heating or voltage-drop problems.
SG90 servo motors exhibit significantly higher power consumption, particularly under load, with each motor potentially requiring currents of 250–500 mA or greater in lock-up mode. Consequently, their power supply is derived not from the 5 V pin of the ESP32 board, but rather from an external 5 V source.
It is important to note that all grounds (GND) are interconnected. The common ground point is crucial for the control signals (PWM) transmitted by the ESP32 to the servo motors, as it provides a shared reference; without it, the motors would fail to correctly interpret the positioning pulses. The absence of a common ground may result in erratic or entirely non-functional motor behavior.

2.4. System Programming

The system consists of three applications operating concurrently to replicate movements in virtual space while also receiving commands from it. The ESP32 and ESP8266 microcontrollers are programmed in the Arduino environment using C++ (Visual Studio 2022). The software running on the computer is developed in Unreal Engine 5 using a method specific to this platform, namely Blueprints. In Unreal Engine, Blueprints constitute a visual programming framework that enables developers to formulate game logic without coding in C++. This technology is integral to the Unreal Engine architecture and was developed to offer a straightforward and cost-effective solution both for developers lacking programming expertise and for those seeking to rapidly construct working prototypes. Blueprints use a node-based graphical interface in which each node represents a function, event, variable, or logical action. Nodes are interconnected by "wires" (connections), which define the execution flow and the relationships among data. This system enables the modeling of player behavior, the creation of graphical user interfaces (UI), collision management, object animation, game logic definition, and additional functionality.
The program developed for the ESP32 is responsible for initializing the pins connected to physical components, including resistive rotary sensors, servo motors, and the MPU6050 motion sensor. It configures the PWM channels required for controlling the five servo motors, each corresponding to a finger of the glove, and creates an I2C connection with the orientation sensor.
Throughout execution, the microcontroller persistently acquires the analog values from the resistive rotary sensors, which indicate the precise positions of the fingers. The values are transformed into a range that corresponds to the PWM control signals, enabling the servo motors to follow the movement of the fingers.
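As a hedged sketch, this value transformation might look as follows in standard C++ (the mapping limits are illustrative defaults for a 12-bit ESP32 ADC and a 0-180 degree servo range, not the calibrated firmware values):

```cpp
#include <cstdint>

// Linear rescaling, equivalent to the Arduino map() helper.
long mapRange(long x, long in_min, long in_max, long out_min, long out_max) {
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
}

// Convert a raw 12-bit ADC reading (0-4095 on the ESP32) from a resistive
// rotating sensor into a servo command angle (0-180 deg), clamping out-of-
// range inputs. The limits are illustrative, not per-finger calibration.
int adcToServoAngle(int adc_value) {
    if (adc_value < 0) adc_value = 0;
    if (adc_value > 4095) adc_value = 4095;
    return static_cast<int>(mapRange(adc_value, 0, 4095, 0, 180));
}
```

In practice each finger would use calibrated minimum and maximum ADC values instead of the full 0-4095 span.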
Data from the accelerometer and gyroscope are concurrently analyzed to determine hand orientation, employing a complementary filter for enhanced stability and precision.
All this information is conveyed using the UDP protocol to an ESP8266 module linked to a computer. The application also enables the reception of control commands.
The data communicated from the ESP32 to the ESP8266 is formatted as [a; b; c; d; e; f; g; h; i; j; k], where a, b, c, d, and e represent readings from the resistive rotating sensors; f, g, and h denote the pitch, roll, and yaw angles; and i, j, and k indicate the positional data of the hand. The data received from Unreal Engine via the ESP8266 module is formatted as [a; b; c; d; e], with each value representing a finger of the hand. A reading reaches 100 when the user fully flexes the finger and drops to 0 when the finger is completely extended. In standard operation, the servo motors track the position of the fingers based on the readings from the resistive rotating sensors. When the system detects that a virtual object is gripped (value 100), the movement of the servo motors is constrained accordingly, inhibiting further finger movement and thereby imitating tactile resistance. The application thus establishes a complete interface between the person wearing the glove and the digital or robotic environment, enabling seamless, real-time interaction.
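A minimal standard-C++ sketch of parsing such a packet on the PC side might look as follows (the function name and error handling are illustrative; the actual application performs this step with Unreal Engine Blueprints):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Parse a packet of the form "[a; b; ...; k]" into a vector of floats.
// The surrounding brackets are stripped and ';' is used as the delimiter;
// std::stof skips the leading whitespace left after each ';'.
std::vector<float> parsePacket(const std::string& packet) {
    std::string body = packet;
    if (!body.empty() && body.front() == '[') body.erase(body.begin());
    if (!body.empty() && body.back() == ']') body.pop_back();
    std::vector<float> values;
    std::stringstream ss(body);
    std::string token;
    while (std::getline(ss, token, ';')) {
        values.push_back(std::stof(token));
    }
    return values;
}
```

A full glove packet would yield eleven values; a finger-command packet from Unreal Engine would yield five.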
The ESP8266 application provides bidirectional communication between the ESP32 and the Unreal Engine virtual environment via the WiFi network and the UDP protocol. Initially, the WiFi connection is established with the local network's name and password. Subsequently, two UDP channels are opened: one for receiving data from the ESP32 and another for transmitting data back to it, according to the commands received from Unreal Engine via the serial port. In the main loop, the module receives the messages transmitted over UDP from the ESP32 and forwards them to Unreal through the serial interface. Concurrently, it acquires serial data from Unreal (in textual format) and transmits it to the ESP32 over UDP, thereby synchronizing physical and virtual activities.
The Unreal Engine software is more intricate; nonetheless, we will concentrate on a few key components. The first aspect is communication with the ESP over the serial port. The Serial COM plugin for Unreal Engine 5 facilitates bidirectional communication between the graphics engine and an external device via the serial port. It enables the establishment of a serial connection, the transmission and reception of data in text or binary format, and its incorporation into the application logic through Blueprint nodes. Consequently, data from sensors or external commands may be used to manipulate virtual objects, while Unreal can transmit responses to the physical device. The port and communication speed (115,200 baud for the ESP32) are selected, and the communication is initiated. Upon receipt of data over the serial port, it is parsed (Figure 10), transforming the raw string of characters into a structured format that can be readily interpreted and used within the application.
After receiving the string, the system checks that it complies with the default format {a; b; c; d; e; f; g; h; i; j; k}. If this condition is met, the string is divided into an array of individual components, using the ";" symbol as a delimiter. Each element of the array is then accessed by its position and converted from textual format (string) to numeric format (float) so that it can be used in calculations, in controlling movements, or in updating states in the virtual environment. This process ensures a correct and precise interpretation of the transmitted data. For the finger positions, the values at array positions 0 to 4 are used, which represent the data from the resistive rotating sensors; the system checks that they fall within the valid range for finger movement and sets the corresponding position in the virtual environment (Figure 11).
The hand's orientation is determined by a comparable procedure that extracts the MPU6050 data located at array positions 5, 6, and 7. The values are multiplied by −1 when necessary to align the physical orientation of the hand with its representation in the virtual world (Figure 12), and are then assigned to the relevant parameters of the virtual model.
The virtual environment has two primary entities known as actors. The first actor is the hand. The second principal actor is the object that can be manipulated with the glove.
In the virtual world established in Unreal Engine, the virtual hand is depicted by an actor consisting of multiple interrelated components, each designated for particular functions in its rendering and control.
The static 3D model of the hand, designated "VR_Hand_Right", is employed for the overall display of the hand in virtual environments. The corresponding blueprint, "VR_Hand_Right_BP", encompasses the functional and behavioral logic of the hand, facilitating interactions and responses to external commands. The physics element, "VR_Hand_Right_Physics", defines the object's physical attributes, including collision dynamics and force responses, and is crucial for realistic modeling of interactions with other objects. The hand's skeleton, "VR_Hand_Right_Skeleton", supplies the virtual bone framework required for animation, while the "VR_Hand_Right_Skeleton_Animation" file contains the animations linked to this skeleton, enabling the replication of natural finger and hand movements.
These components collaborate to establish a cohesive and engaging framework for depicting the hand in virtual reality. By utilizing the program's components, we can identify collisions between the hand and the object within the virtual world and send commands to the servo motors to prevent the fingers from closing further when the object is grasped.

3. Results

3.1. Robotic Arm and Gripper Manufacturing

Of particular importance is the choice of technology for realizing the robotic arm and gripper subassemblies used to manipulate structures between or after thermal treatments. This selection depends on several criteria, including the starting material, overall dimensions, working conditions, and other technical and economic factors. Currently, the mechanical structure of the robotic arm can be realized using three primary technological methods (Figure 13) with distinct working principles or, in certain instances, a combination of them: subtractive [31], formative [32], and additive [33].
The first option is the subtractive category, namely traditional CNC machining (milling, turning, etc.), which is very advantageous in terms of dimensional accuracy and the characteristics of the resulting elements, but the processing cost is quite high due to the complexity of the equipment and, when machining hard materials, of the tools. The second variant is formative manufacturing: casting or injection molding. In this scenario, the structures produced exhibit excellent precision and a diverse selection of materials is available, making it the ideal option for large-scale manufacturing. On the other hand, a long time and significant expense are required for making and finishing the mold, and flexibility with respect to geometric design changes is very limited and incurs additional costs. The third category that can be chosen for the mechanical part of the robotic arm is additive manufacturing, or rapid prototyping. This method is the most suitable for combining a high potential for personalization, flexibility, and adaptability of the design with the ability to obtain extremely complex and detailed forms, being more competitive in this respect than the two aforementioned technological approaches.
However, additive technologies usually require post-processing, and printed parts may exhibit mechanical characteristics inferior to those obtained through traditional techniques. Nevertheless, some current additive manufacturing technologies and equipment can produce components with characteristics superior to those of classic methods, and the field is continuously developing [34,35].
Several additive manufacturing techniques and methods utilize various operating principles [36]. Techniques applicable to the discussed application include Fused Deposition Modeling (FDM) [37]; Powder Bed Fusion (PBF) techniques, which include Selective Laser Sintering (SLS), Selective Laser Melting (SLM), and Electron Beam Melting (EBM) [38]; and Direct Metal Deposition (DMD) [39]. We primarily use the first method to materialize test models/prototypes, while the last two can be used to create the final product. Given that the present case involves testing a prototype to optimize the final solution, and considering technology accessibility, FDM appears to be the most suitable and logical choice for realizing the robotic arm. This additive manufacturing method uses a variety of thermoplastic filaments in various configurations with different characteristics and finds multiple applications in prototyping/test models and in other fields [40,41,42].
Figure 14 presents an example of the main types of sacrificial layers that can be used as supports in the PrusaSlicer 2.7.4 software. Figure 15 shows the main infill pattern configurations available for the Prusa i3 MK3 printer (25% fill density in all cases). These internal configurations play an important role in establishing the final properties of the printed structure [43]. For the presented application, the most suitable option is the Gyroid infill pattern, owing to the superior structural and mechanical properties it provides, as demonstrated by recent studies in the specialized literature [44].
Figure 16 presents the base of the robotic arm on the virtual build platform of the 3D printer together with the simulated layer deposition. The estimated print time for this structure, using snug-type supports, is roughly 17 h. Additionally, the 3D printer's software allows the viewing of printing times and speeds by printing region (feature type), as shown in Figure 17. Since this is an initial prototype, polylactic acid (PLA) can be used as the material, being an accessible solution with suitable characteristics. The following key printing parameters can be adopted for the first prototype: 15-20% infill (Gyroid pattern), 0.2 mm layer height with the standard quality preset for a 0.4 mm nozzle, 210 °C extrusion temperature (215 °C for the first layer only), and 60 °C bed temperature. Regarding supports, grid, snug, or organic types can be applied depending on the most suitable and efficient way to realize the sacrificial layer for a given geometry.
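For reference, these parameters could be collected in a PrusaSlicer-style profile fragment (key names follow PrusaSlicer's configuration syntax; treat this as an illustrative sketch rather than the project's actual profile):

```ini
# Sketch of a PrusaSlicer profile fragment for the first PLA prototype
# (values as stated in the text; 15-20% infill, here set to 20%)
fill_density = 20%
fill_pattern = gyroid
layer_height = 0.2
nozzle_diameter = 0.4
temperature = 210
first_layer_temperature = 215
bed_temperature = 60
support_material = 1
# grid, snug, or organic, depending on the geometry
support_material_style = snug
```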
The complete mechanical assembly of the robotic arm was engineered in SolidWorks, with each component modelled in detail: arm segments, joints, motor mounts, and the gripper. The objective was a compact design that facilitates 3D printing while providing rigidity and stability. The modular design allowed rapid modifications in response to the mounting requirements and the constraints of the 3D printer. The arm structure comprises 13 3D-printed components fabricated from PLA filament, selected for its rigidity and ease of processing. The arm's movement is provided by four MG996R servo motors (Tower Pro Pte Ltd., Singapore) positioned at the joints and a stepper motor that rotates the base. The components are joined using screws, spacers, and gears, ensuring motion transmission and system stability. Certain components required manual adjustment owing to the printer's dimensional tolerances. Figure 18 presents the 3D-printed elements of the robotic system. Considering that the concept described in the paper is at an initial stage, FDM technology was chosen because it is the most cost-effective technology available, offers a wide range of accessible thermoplastic materials for prototyping, and can create test models with adequate mechanical and strength characteristics. Another advantage is scalability, as well as the possibility of developing and testing multiple models simultaneously to choose the optimal variant.

3.2. Robotic Arm and Gripper Modelling for VR Controlling

Initially, the robotic arm is simulated using software comprising two components: a skeletal framework consisting of elements (Figure 19a) and joints, which collectively form a structure that accurately resembles the actual robotic arm (Figure 19b). The robot’s modular design allows for many reconfigurations (Figure 19c,d), providing significant functional flexibility and adaptability for diverse applications.
The subsequent actions involve exporting the model in FBX format (Figure 20a) and importing it into Unreal Engine (Figure 20b).
The imported model comprises three components: the skeletal mesh component (which facilitates the kinematics of the robotic arm), the relationship component, and the structural mesh component. An animation of the arm components is developed to govern and animate the skeletal meshes based on inputs, variables, or states that influence the movement, by altering the orientations of the components constituting the robot's body. Their structure is hierarchical: if the rotation of a parent element changes, the child elements retain their local positions while their global positions shift. The animation takes control of the final skeleton pose and relays any alterations to the arm. Using the visual programming environment, the animation of the arm elements in the VR environment is realized, as can be seen in Figure 21. The animation commences by enabling the input from the glove and proceeds according to the user data captured by the glove and transmitted to the VR environment, as can be seen in Figure 22. Each element is animated according to a finger movement (press action) until the target position is achieved, as can be seen in Figure 23. The gripper is animated similarly, as can be seen in Figure 24.
Figure 25 shows an example of moving the basic elements; all the movements were performed using the glove (Figure 26).
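The hierarchical behaviour described above, in which rotating a parent element shifts the global positions of all child elements while their local positions stay unchanged, can be illustrated with a minimal planar forward-kinematics sketch (an illustration of the principle, not the Unreal Engine implementation):

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """Compute global joint positions of a planar serial chain.

    Each angle is local (relative to the parent element), so rotating
    the base element moves the global positions of every child while
    their local angles remain unchanged.
    """
    x, y, theta = 0.0, 0.0, 0.0
    positions = [(x, y)]
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle                      # accumulate parent rotations
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        positions.append((x, y))
    return positions

# Rotating only the base joint moves every downstream joint globally.
chain = [1.0, 1.0, 1.0]                     # three elements of unit length
rest = forward_kinematics(chain, [0.0, 0.0, 0.0])
rotated = forward_kinematics(chain, [math.pi / 2, 0.0, 0.0])
```

With all local angles at zero, the end effector sits at (3, 0); rotating only the base by 90 degrees carries the unchanged children to (0, 3).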

4. Conclusions

In this paper, we present a human–machine interaction system based on a smart glove that detects hand gestures and replicates them in real time within a virtual world, controlling a robotic arm that can move thermally processed objects from the laser treatment station, through the cryogenic treatment, and finally to the testing stand. The primary objective of this paper is to propose and present the concept and key features of the mechatronic platform, with the aim of implementing the physical experimental model.
The mechatronic platform consists of laser equipment, cryogenic equipment, two robotic positioning systems, the control unit, and an optical measurement stand. In this article, we concentrate exclusively on the robotic positioning systems, specifically the robotic arms and grippers, and showcase two distinct types of these systems. The parts of the robotic arms and grippers were manufactured by 3D printing, and we presented an example of the parameters and key steps required to manufacture the base of the first robotic arm.
The proposed control device incorporates position sensors (resistive rotary sensors) attached to each finger, an MPU6050 inertial module for determining the overall orientation of the hand, and a series of SG90 servo motors to replicate the sensation of grasping. The system is coordinated by two microcontrollers: an ESP32 on the mobile side (the glove) and a stationary ESP8266, which communicate over the UDP protocol, providing an efficient bidirectional connection between the real and virtual realms.
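The glove-to-base exchange can be sketched as a simple UDP datagram carrying the five finger readings and the MPU6050 orientation. The comma-separated ASCII layout below is an assumption for illustration, not the authors' exact wire format:

```python
import socket

def encode_packet(fingers, orientation):
    """Pack five finger angles plus (roll, pitch, yaw) into one
    comma-separated ASCII datagram (hypothetical layout)."""
    values = list(fingers) + list(orientation)
    return ",".join(f"{v:.1f}" for v in values).encode()

def decode_packet(datagram):
    """Split a datagram back into finger angles and orientation."""
    values = [float(v) for v in datagram.decode().split(",")]
    return values[:5], values[5:]           # fingers, (roll, pitch, yaw)

# UDP is connectionless: a lost datagram is simply superseded by the
# next sensor reading, so low latency matters more than guaranteed
# delivery for this kind of real-time glove telemetry.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = encode_packet([10, 20, 30, 40, 50], [1.5, -2.0, 90.0])
fingers, rpy = decode_packet(packet)
```

A datagram per sensor sample keeps the bidirectional link stateless on both the ESP32 and the stationary side.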
The system described is complex and requires extensive examination and construction. In future work, we must optimize, build, and experimentally test the positioning systems. Furthermore, we must optimize the trajectory components in terms of hardware resource usage and user-friendliness and develop a user interface. Additionally, the measuring stand must integrate the software required to assess the quality of the laser treatment.
The development prioritized critical elements of mechatronic design, including miniaturization, energy efficiency, communication reliability, and system adaptability. The personalized calibration of the resistive rotary sensors enabled the glove to adapt to the varied anatomy of the human hand, enhancing the precision of finger position measurement.
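The per-user calibration idea can be sketched as a linear mapping from raw sensor readings to a finger angle, anchored at the user's own open-hand and closed-hand extremes. The 0–90 degree range and the raw values are illustrative assumptions:

```python
def calibrate(raw_open, raw_closed):
    """Return a function mapping a raw sensor reading to a finger
    angle in degrees, using per-user open/closed hand extremes.

    Capturing the two endpoints once per user lets the same glove
    adapt to different hand anatomies; the 0-90 degree span is an
    assumed convention, not the authors' stated range.
    """
    span = raw_closed - raw_open
    def to_angle(raw):
        ratio = (raw - raw_open) / span
        ratio = min(1.0, max(0.0, ratio))   # clamp noisy readings
        return 90.0 * ratio
    return to_angle

# Hypothetical ADC extremes recorded during a calibration gesture.
index_angle = calibrate(raw_open=310, raw_closed=2950)
```

Clamping keeps out-of-range readings (sensor drift, loose fit) from producing impossible finger angles.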
The integration of the MPU6050 enhanced the simulation's realism, enabling the system to replicate the hand's three-dimensional orientation in space. Communication between the two ESP boards was established with little latency, and the real-time control of the servo motors responded to actual user motions with a high degree of realism, which constitutes an advantage over similar systems. The main constraint of the remote control is its low speed and repeatability, which depend strongly on user experience.
A particularly significant aspect of the project is the integration of the physical system with the virtual environment created in Unreal Engine. This integration enabled the visualization of movements in virtual space and the transmission of commands to the physical system, providing a bidirectional feedback interface, an essential component in advanced virtual reality, robotics, and medical rehabilitation applications.
The experimental findings validate the system's proper operation, its capacity for real-time response to multiple inputs, and its adaptability across many scenarios. The advanced smart glove serves as a working prototype for several research avenues and commercial applications, including VR simulations for professional training, video games, and myoelectrically controlled prosthetics.
In conclusion, the proposed concept advances the idea of a natural interface between humans and cyber systems, marking a significant step in the development of portable, interactive, and precise technologies. The selected hardware–software architecture provides both performance and scalability, allowing straightforward expansion with additional modules or functionalities, such as haptic feedback, complex gesture detection, or machine learning for behavioral customization. The integrated approach illustrates both the feasibility of a functional technical solution and its potential for conversion into a tangible product.
Notwithstanding the strong functionality and promising outcomes attained in the project, several avenues for enhancement and expansion can be pursued in the future, with the objective of improving the system's performance, portability, and applicability in complex, real-world situations.
A crucial developmental focus is the accurate identification of the hand's position in three-dimensional space, beyond merely assessing the orientation and relative placement of the fingers. To this end, supplementary technologies such as stereo cameras, infrared (IR) sensor stations, or AprilTag-based motion detection systems may be incorporated, strategically placed within the environment to monitor the hand's absolute position relative to a global reference frame. This form of localization would facilitate more accurate interactions in augmented or virtual reality settings, as well as sophisticated manipulation applications where precise positioning is crucial.
Another crucial focus is the miniaturization and tighter integration of the electrical and mechanical components. The system presently employs modular components (ESP32 board, MPU6050, SG90 servo motors, etc.), which, while efficient for development, occupy considerable space and may affect user comfort. The next phase of research will entail the design of custom circuit boards tailored for compactness and efficiency, alongside the integration of miniature actuators based on linear servo mechanisms, shape-memory polymers, or small pneumatic actuators. This integration will result in a portable, lightweight, ergonomic, and durable device, crucial for professional or commercial applications.
Alongside the hardware, the software could also undergo substantial enhancement. Deploying more sophisticated filters to improve the precision of the sensory input, together with incorporating machine learning algorithms for gesture identification and classification, would open novel avenues for natural interaction between the user and the system. A further avenue for expansion is to improve the communication protocol and minimize latency by transitioning from UDP to hybrid or specialized protocols for real-time applications (e.g., MQTT or ROS 2), depending on the application's complexity and bandwidth constraints.
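A minimal example of the sensory-input filtering mentioned above is a first-order exponential moving average; more elaborate options (Kalman or complementary filters) follow the same update pattern. The filter and its coefficient are illustrative, not part of the implemented system:

```python
class ExponentialSmoother:
    """First-order low-pass (exponential moving average) filter.

    alpha close to 1 tracks fast gestures; alpha close to 0
    suppresses sensor jitter at the cost of added lag.
    """
    def __init__(self, alpha):
        self.alpha = alpha
        self.state = None
    def update(self, sample):
        if self.state is None:
            self.state = float(sample)      # seed with first reading
        else:
            self.state += self.alpha * (sample - self.state)
        return self.state

f = ExponentialSmoother(alpha=0.5)
readings = [0.0, 10.0, 10.0, 10.0]          # a step change in the sensor
smoothed = [f.update(r) for r in readings]
```

With alpha = 0.5 the filtered value converges halfway to the new reading at each step (0, 5, 7.5, 8.75), smoothing the step without rejecting it.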
Another significant direction is the investigation of the parameters and processes of thermal processing through an applicative study of the materials and surfaces to be processed.

Author Contributions

Conceptualization, A.D.A. and C.-G.A.; methodology, C.-G.A. and E.M.; software, M.-V.G. and A.D.A.; investigation, M.-I.N., M.-V.G. and E.M.; resources, M.-I.N. and A.D.A.; writing—review and editing, C.-G.A. and E.M.; supervision, C.-G.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National University of Science and Technology Politehnica Bucharest, grant GNAC ARUT 2023 number 73 from 12 October 2023, with the title “Laser Treatments Followed by Cooling at Low Temperatures,” acronym TTLRTJ.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Educational testing mechatronic platform.
Figure 2. Methodology framework.
Figure 3. 3D model of the robotic positioning system.
Figure 4. Kinematic scheme of the robotic positioning system: (a) the components of a manipulating robot; (b) the inverse kinematic scheme. Blue in (b) marks the same sample of mass M as in (a). The arrows illustrate that joints A and B rotate.
Figure 5. Controlling gloves.
Figure 6. Diagram illustrating the concept of the entire assembly.
Figure 7. Software for finger kinematics in Unreal Engine.
Figure 8. Open and closed positions of the virtual hand.
Figure 9. Angular positioning system.
Figure 10. Data parsing.
Figure 11. Setting the position of a virtual finger.
Figure 12. Setting the position of the hand in the virtual model.
Figure 13. Main methods and technologies for manufacturing the mechanical structure of robotic arms.
Figure 14. Types of sacrificial (support) layers in the 3D printer (Prusa i3 MK3) software: (a) grid type; (b) snug type; (c) organic type.
Figure 15. Types of infill pattern configuration (25% fill density for all cases).
Figure 16. Base of the robotic arm on the virtual working platform of the 3D printer and the simulation of layer deposition.
Figure 17. Estimated printing times and speeds in different regions.
Figure 18. Printed elements of the robotic arm: (a) top view of the robotic arm without the gripper; (b) top view of the robotic arm with the gripper; (c) view of the elements of the robotic arm.
Figure 19. Simulation of the robotic arm: (a) skeletal structure with 3 elements; (b) robotic arm with 3 elements; (c) robotic arm with 4 elements; (d) robotic arm with 5 elements.
Figure 20. Exporting the model in FBX format and importing it into Unreal Engine: (a) selecting the FBX format from the File/Export menu; (b) selecting the import from the File/Import menu.
Figure 21. Software for the animation of the arm.
Figure 22. Software for data transmission from the glove to control and command the arm.
Figure 23. Software for animating one element of the robotic arm.
Figure 24. Software for animating the gripper.
Figure 25. Example of moving the basic elements: (a) initial position of the base element; (b) clockwise rotation of the base element; (c) rotation of the second item in the hierarchy; (d) final position of the second item in the hierarchy; (e) rotation of the third item in the hierarchy; (f) final position of the third item in the hierarchy; (g) the gripper elements translate and move together; (h) final position of the gripper elements.
Figure 26. The virtual control of the robot using a glove.