Article

High-Frame-Rate Camera-Based Vibration Analysis for Health Monitoring of Industrial Robots Across Multiple Postures

Graduate School of Advanced Science and Engineering, Hiroshima University, Hiroshima 739-8527, Japan
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(23), 12771; https://doi.org/10.3390/app152312771
Submission received: 7 October 2025 / Revised: 23 November 2025 / Accepted: 28 November 2025 / Published: 2 December 2025
(This article belongs to the Special Issue Innovative Approaches to Non-Destructive Evaluation)

Abstract

Accurate vibration measurement is crucial for maintaining the performance, reliability, and safety of automated manufacturing environments. Abnormal vibrations caused by faults in gears or bearings can degrade positional accuracy, reduce productivity, and, over time, significantly impair production efficiency and product quality. Such vibrations may also disrupt supply chains, cause financial losses, and pose safety risks to workers through collisions, falling objects, or other operational hazards. Conventional vibration measurement techniques, such as wired accelerometers and strain gauges, are typically limited to a few discrete measurement points. Achieving multi-point measurements requires numerous sensors, which increases installation complexity, wiring constraints, and setup time, making the process both time-consuming and costly. The integration of high-frame-rate (HFR) cameras with Digital Image Correlation (DIC) enables non-contact, multi-point, full-field vibration measurement of robot manipulators, effectively addressing these limitations. In this study, HFR cameras were employed to perform non-contact, full-field vibration measurements of industrial robots. The HFR camera recorded the robot’s vibrations at 1000 frames per second (fps), and the resulting video was decomposed into individual frames according to the frame rate. Each frame, with a resolution of 1920 × 1080 pixels, was divided into 128 × 128 pixel blocks with a 64-pixel stride, yielding 435 sub-images. This setup effectively simulates the operation of 435 virtual vibration sensors. By applying mask processing to these sub-images, eight key points representing critical robot components were selected for multi-point DIC displacement measurements, enabling effective assessment of vibration distribution and real-time vibration visualization across the entire manipulator. This approach allows simultaneous capture of displacements across all robot components without the need for physical sensors. The transfer function is defined in the frequency domain as the ratio between the output displacement of each robot component and the input excitation applied by the shaker mounted on the end-effector. The frequency–domain transfer functions were computed for multiple robot components, enabling accurate and full-field vibration analysis during operation.

1. Introduction

Precise vibration measurement is critical for ensuring the performance, reliability, and safety of automated manufacturing systems. In smart factories, industrial robots increasingly perform tasks traditionally carried out by humans, such as welding, picking, cutting, and milling [1,2,3,4]. Repetitive operations can lead to mechanical wear, including insufficient lubrication, bearing faults, loose belts, and worn gears [5,6], which over time create discrepancies between commanded and actual robot positions, reducing control accuracy [7,8]. As the interaction between robots and human workers increases in smart factories [9], the risk of unexpected machine failures also rises [10,11,12]. Preventive maintenance reduces the frequency of equipment failures [13,14], thereby enhancing system reliability and extending the operational lifespan of machinery [15]. Moreover, even minor failures of robot components in such environments can halt supply chains, disrupt production lines, and result in significant economic losses [16,17,18]. By monitoring vibrations, including those caused by bearing failures and gear faults [17,19,20], machine defects can be detected at an early stage. Early identification of abnormal machine behavior is essential to prevent serious accidents and risks, allowing the detection of abnormal vibrations before they reach a critical level. The development of HFR vision systems enables the capture of images at hundreds of thousands of frames per second [21,22,23,24].
Vibration visualization is essential for precision operations because even small vibrations can cause positioning errors, reduce machining accuracy, and result in assembly misalignments. Visualizing vibrations enables the identification of dynamic disturbance propagation through robot components, early detection of abnormal behaviors, and real-time evaluation of structural integrity. These insights facilitate the optimization of control parameters, improvement in motion accuracy, and prevention of performance degradation during precision operations. To address these challenges, a vibrating machine was mounted on the end-effector of a robot manipulator to induce controlled vibrations. In this study, an HFR camera was used to capture the robot’s vibrations at a resolution of 1920 × 1080 pixels, with an exposure time of 1000 µs at 1000 fps. To minimize the influence of environmental factors such as temperature and illumination variations, all experiments were conducted in a temperature-controlled laboratory under stable ambient lighting conditions. Vibration data were recorded for three distinct postures to evaluate dynamic responses under different configurations, ensuring reproducibility and consistency of the measurement conditions. Furthermore, a real-time vibration visualization system was developed to estimate and display vibration distributions across a wide frequency range. The proposed method addresses the limitations of conventional vibration monitoring approaches, which often rely on contact sensors that provide only single-point measurements. By integrating sub-pixel, full-field DIC with short-time Fourier transform (STFT) analysis, the system simultaneously visualizes amplitude variations across multiple regions of the robot components. Moreover, the proposed method enables time–domain and frequency–domain analysis of multi-point vibrations across various robot configurations, without the need for physical sensors. This approach allows detection of configuration-dependent resonances and the estimation of transfer functions for individual components, establishing a foundation for non-contact, posture-aware robot health monitoring in smart factory environments. The remainder of this paper is organized as follows. Section 2 provides an overview of prior research on vibration analysis using both physical sensors and Digital Image Correlation (DIC). Section 3 outlines the methodologies employed in this study, including the DIC procedure, time–domain vibration analysis, and frequency–domain characterization techniques. Section 4 presents the experimental results, encompassing multi-point DIC measurements, frequency–domain responses, and transfer function estimation. Section 5 offers a detailed discussion of the findings and associated limitations. Section 6 summarizes the main conclusions of this study, followed by a discussion of recommended directions for future research.

2. Related Works

2.1. Vibration Analysis with Sensors

Over the past several decades, sensor-based vibration analysis has been widely employed to evaluate the operational condition and structural integrity of industrial robots, providing critical insights into component health and overall system performance. The vibration characteristics of robot manipulators are typically monitored using physical sensors such as strain gauges, accelerometers [25,26], and acoustic sensors [27], which are generally classified into two categories: wired and wireless. Conventional wired vibration measurement techniques primarily rely on sensors, such as accelerometers and strain gauges, to monitor structural and mechanical responses. However, they are limited to specific measurement points. Multi-point measurements require numerous individual installations, increasing system complexity and maintenance costs. Ryuta et al. [28] performed tests at 48 locations with five repetitions each, totaling 240 hammerings to obtain consistent results. Wired sensors are further constrained by limited accessibility, narrow measurement ranges [29,30], and susceptibility to vibration-induced errors in inertial and gyroscopic sensors [31]. Moreover, sensor attachment adds mass and stiffness at the mounting points, potentially altering natural force transmission and distorting the true vibration response [32]. Physical contact can also restrict robot motion and complicate setup in confined spaces. Additionally, hammer-based excitation methods often fail to cover the full detectable frequency range [33], making reliable data acquisition challenging. With the advancement of smart factory technologies, the demand for wireless and non-contact vibration monitoring has significantly increased [34]. Wireless vibration monitoring technologies encompass Laser Doppler Vibrometers (LDVs), acoustic sensing methods, and Micro-Electro-Mechanical Systems (MEMS) acoustic sensors. LDVs provide high-precision, non-contact vibration measurement by detecting Doppler frequency shifts from scattered laser light [35,36]. However, their performance strongly depends on surface reflectivity and optical path stability, making installation, alignment, and calibration difficult in industrial robot environments [37]. Acoustic-based sensing methods have been applied in fields such as pipelines, hydro-generators [38], and rotating machinery [39]. Their performance is influenced by the signal-to-noise ratio (SNR), which determines the minimum detectable sound pressure [40]. In industrial settings, echoes, overlapping sound sources, and strong background mechanical noise impair early fault detection [41], and abnormal acoustic signatures typically appear only after considerable wear or damage [42]. MEMS accelerometers are increasingly used in Structural Health Monitoring (SHM) to capture dynamic responses. When integrated into Wireless Sensor Networks (WSNs), they enable distributed, real-time vibration monitoring [43,44]. However, wireless systems face challenges such as limited battery life, lower transmission rates compared to wired systems, synchronization difficulties, and performance degradation due to communication delays, power constraints, and sensor placement [45]. Recently, MEMS acoustic sensors, which combine traditional acoustic sensing principles with MEMS technology, have been widely applied in microphones, hydrophones, biosensors, and many related devices. 
These sensors continue to advance toward greater miniaturization, higher performance, multimode operation, intelligent processing, improved integration, environmental adaptability, and enhanced mechanical flexibility, making them increasingly suitable for modern sensing applications [39,46]. MEMS acoustic sensors typically employ capacitive, piezoresistive [47], or piezoelectric [48] transduction mechanisms. MEMS technology enables the fabrication of highly sensitive, low-power acoustic transducers with small form factors, which support a wide range of applications in industrial monitoring, robotics, consumer electronics, and IoT devices [47,49,50]. With integrated signal-processing capabilities, MEMS microphones can perform real-time sound or vibration detection, frequency analysis, and noise suppression. However, capacitive microphones, although capable of producing high signal output, require additional power and are sensitive to dust, humidity, and electromagnetic interference, which makes them vulnerable in factory environments. Acoustic-emission-based methods depend on spontaneous sound-generating events, which limits their applicability to systems that consistently produce detectable acoustic emissions [41]. In addition, MEMS acoustic sensors rely on pressure-gradient mechanisms, so their output is inherently direction-dependent and sensitive to the relative position of the sound source [51]. Justin et al. [52] conducted a detailed analysis of a multi-resonant directional sensing system, demonstrating that MEMS acoustic sensors can be effectively scaled to operate across different frequency ranges of individual wings. However, in real operating conditions, multiple components of a robot often vibrate at the same frequency, complicating accurate vibration source identification. Therefore, in smart factory environments, where multiple robots operate simultaneously within the same workspace, strong background noise, acoustic interference, and overlapping vibration signatures further challenge the isolation of individual vibration sources using acoustic sensors alone. Recent advancements in HFR video-based vibration monitoring enable modal analysis and structural damage detection in robots. A key advantage of DIC is its ability to capture the responses of all points within the measurement region simultaneously and under identical excitation conditions [53]. By integrating HFR cameras with advanced DIC algorithms, the proposed method enables non-contact, multi-point, full-field vibration measurement of robot manipulators, providing detailed vibration information without the need for physical sensors. Compared with traditional sensors, the HFR video-based approach allows simultaneous observation of multiple robot components and provides richer spatial information for evaluating dynamic behavior, thereby overcoming the inherent limitations of conventional sensor-based measurement techniques.

2.2. HFR-Video-Based Vibration Monitoring Analysis

Many studies on robot manipulator vibrations have used standard vision systems operating at 30 frames per second, which cannot capture rapid, audio-frequency vibrations. The development of HFR cameras capable of recording hundreds of thousands of frames per second enables detailed vibration analysis. With multi-point, sub-pixel measurement capabilities for full-field dynamic analysis, these cameras allow non-contact vibration monitoring without the need for multiple sensors. This approach enables measurement of vibrations at audio-frequency levels across multiple points [24,54] and can localize vibration sources more accurately than acoustic source localization methods [55]. Fadi et al. [56] applied time-series analytics for defect detection. However, the initial three- and six-hour sensor data collections were insufficient, necessitating an additional 30-h run to obtain a meaningful dataset. Similarly, a laser vibrometer requires repeated measurements at each point, making the acquisition process time-consuming and often taking several hours to achieve comparable spatial and temporal resolution [57]. The most common method for diagnosing shaft-related problems is to place an accelerometer on the bearing housing. However, due to attenuation and distortion caused by the bearing structure, capturing the shaft’s true dynamic behavior is challenging [58]. Compared with traditional sensor-based methods, HFR cameras enable faster and more efficient data acquisition. High-speed cameras can capture image sequences during shaft rotation, providing valuable insights for machinery monitoring and fault diagnosis [59,60]. The measurement accuracies of both DIC and LDV techniques are detailed in [61,62]. The accuracy of the proposed DIC-based vibration measurement method was validated with reference to Wenxiang et al. [63], who reported vibration amplitudes ranging from 0.015 mm to 0.263 mm, with a precision of 3–5 μm at the identified peak frequencies. This demonstrates that DIC can achieve micrometer-level accuracy comparable to that of laser displacement sensors, even under high-vibration conditions. These findings support the reliability of the present study, in which the HFR camera operated at 1000 fps with a spatial resolution of 0.6 mm/pixel. The vibration measurement methods discussed in this study are summarized in Table 1, which compares their key characteristics and limitations.
Cheng et al. [64] investigated vibration recognition in buses, construction sites, subways, and high-speed rail systems, where data were collected via sensors and the target objects acted as vibration sources with distinct intensity levels. Li et al. [23], Shimasaki et al. [65], and Jiang et al. [66] utilized HFR cameras to localize vibration sources through digital filtering techniques, which rely on identifying unique vibration signatures associated with individual components. These methods assume that vibrations at each measurement point exhibit distinct characteristics, enabling effective source separation and localization. In contrast, analyzing vibrations across multiple parts of a robot poses additional challenges. Components are typically excited simultaneously with similar inputs, resulting in highly similar vibration patterns across different parts. Consequently, isolating the vibrations of individual sources using conventional filtering techniques becomes significantly challenging. Robot manipulators can be classified into two categories based on mobility: mobile robots and stationary robots. For the vibration analysis of a mobile robot in our previous study [67], the vibration machine was installed beneath the robot and anchored to the ground. In contrast, for the vibration analysis of a stationary robot in the current study, the vibration machine was mounted on the robot’s end-effector, while the robot itself was securely anchored to the ground. Even when considered independently, variations in friction and stiffness produced noticeable differences in the vibration responses of individual robot components. However, the overall response patterns remained highly similar, making it difficult to distinguish the contributions of specific sources using conventional filtering methods.

2.3. Health Diagnosis of Industrial Robots

Health monitoring of industrial robots is essential for predictive maintenance and reliable operation in smart manufacturing. Prolonged operation under dynamic loads can cause wear, increased vibrations, and joint loosening, leading to misalignments, positioning errors, or system failure. In highly automated facilities, even a single malfunctioning robot can halt entire assembly lines, causing substantial financial losses. Moreover, degraded robot performance poses significant safety risks, particularly in collaborative work spaces where humans and robots operate in close proximity. Effective health diagnosis of operating robot manipulators requires minimizing computational complexity and eliminating unstable data [68]. Optimizing robot postures is critical for smooth and efficient operation, as a robot’s performance, stiffness, and dynamic behavior depend on its position and movement [69,70]. Variations in link lengths, joint angles, and orientation can further compromise positioning accuracy [71], making it essential to define and maintain optimal postures. Proper posture not only minimizes mechanical failures but also ensures the collection of consistent and reliable measurement data [72]. Stable and accurate data are essential for effective robot health diagnosis. By standardizing postures, noise and disturbances are minimized, enabling consistent measurements and facilitating early fault detection.

3. Vibration Test of a Robot Manipulator with HFR-Video-Based Analysis

3.1. System Configuration

In this study, a six-degree-of-freedom vertical articulated robot (MELFA RV-4F-D, Mitsubishi Electric), coated with a random speckle pattern, was used as the measurement target. A compact vibration exciter (S-0105, Asahi Seisakusyo; mass: 400 g) was mounted on the robot’s end-effector, resulting in a total system height of 101 cm. Image acquisition was performed using a high-speed camera (EoSens 2.0CXP2, MIKROTRON) with a resolution of 1920 × 1080 pixels and a 20 mm focal-length lens, positioned 1.2 m from the robot. This setup provided a spatial resolution of 0.6 mm/pixel at a frame rate of 1000 fps. STFT-based vibration visualization was implemented on a GPU-accelerated high-speed vision platform. The algorithm was optimized using an NVIDIA GeForce RTX 2080 GPU and developed in C++ with Microsoft Visual Studio Community 2017. A GPU implementation of the phase-only correlation method, combined with an STFT-based vibration analysis algorithm, enabled simultaneous displacement estimation across all measurement points, following the approach in [24]. This configuration achieved real-time global displacement estimation and frequency analysis at frame rates up to 1000 fps for Full HD images. For accurate measurement, each image was partitioned into 128 × 128-pixel regions of interest (ROIs) with a 64-pixel stride, yielding 435 sub-images that served as virtual vibration sensing regions distributed across the robot and background. Under this setup, displacement estimation for all 435 ROIs required 2.67 ms per frame, demonstrating the capability for high-speed, full-field vibration monitoring. To ensure measurement reproducibility and minimize the influence of environmental factors, all experiments were conducted under strictly controlled conditions. The laboratory environment was maintained at 26 ± 1 °C under stable ambient lighting conditions, as shown in Figure 1.
The specifications of the PC used for the vibration controller are as follows: Processor: Intel(R) Core(TM) i7-8750 CPU @ 2.20 GHz, Memory: 16 GB, OS: Windows 11 Home. The specifications of the PCs used for shooting and analysis are as follows: Processor: Intel(R) Core(TM) i9-12900K 3.19 GHz, Memory: 64.0 GB, OS: Windows 11 Pro.
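The authors’ real-time pipeline is a GPU-accelerated C++ implementation; as a language-agnostic illustration, the following Python/NumPy sketch shows how a Full HD frame could be partitioned into 128 × 128-pixel ROIs with a 64-pixel stride, reproducing the 435-region virtual-sensor grid described above. All function and variable names are illustrative and do not correspond to the authors’ code.

```python
import numpy as np

def partition_into_rois(frame, block=128, stride=64):
    """Split a grayscale frame into overlapping square ROIs (virtual sensors).

    For a 1920 x 1080 frame with block=128 and stride=64 this yields
    29 x 15 = 435 sub-images, matching the grid described in the text.
    """
    h, w = frame.shape
    rois, centers = [], []
    for y in range(0, h - block + 1, stride):
        for x in range(0, w - block + 1, stride):
            rois.append(frame[y:y + block, x:x + block])
            centers.append((x + block // 2, y + block // 2))
    return np.stack(rois), centers

# Example with a synthetic Full HD frame
frame = np.random.rand(1080, 1920).astype(np.float32)
rois, centers = partition_into_rois(frame)
print(rois.shape)  # (435, 128, 128)
```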

3.2. Posture Definition of a Robot Based on Joint Angles

In real-world working environments, the stiffness of a robot significantly affects the quality of machining during operation [73,74]. It is primarily influenced by the robot’s posture [75]. In practical industrial applications, a robot manipulator can assume an infinite number of postures depending on its assigned task and workspace configuration. The variations in posture significantly influence the robot’s dynamic behavior. Among the key contributing factors, posture plays a crucial role in determining the overall stiffness of the robotic system [69]. Vibrations can adversely affect both trajectory tracking accuracy and system stability. Reduced tracking precision leads to positioning errors, while compromised stability may cause excessive stress on actuators and structural components, thereby increasing the risk of mechanical failure and posing safety hazards during operation [76]. Therefore, characterizing the robot’s dynamic behavior is essential for analyzing its vibration response and developing effective control strategies [77]. The operation of a robot in different directions can significantly influence its overall dynamic behavior [78]. Since a robot’s stiffness varies with its posture within the workspace, it is possible to define performance indices to evaluate stiffness characteristics [69].
Accurate vibration analysis requires stable and representative data throughout the operation. To ensure reliable and consistent measurements, three representative postures were selected based on the relative joint angles between adjacent links and the robot’s actual operating cycle, as shown in Figure 2. These postures correspond to key stages of the task process: (1) the initial home position before motion, (2) an intermediate position where the arm extends and experiences maximum load and joint motion, and (3) the final position where the arm returns to the home configuration. These postures capture the full range of dynamic configurations and joint interactions during a complete cycle. The corresponding joint angles for each posture are as follows:
  • Posture 1: α = 0°, β = 0°, γ = 0°
    The robot’s home position, with the vibration testing device aligned with the robot.
  • Posture 2: α = 45°, β = 90°, γ = 45°
    A waiting position used before executing the next movement.
  • Posture 3: α = 45°, β = 0°, γ = 45°
    The final position.
To reduce external disturbances and ensure reliable data collection, the robot’s base was securely anchored to the ground, and the robot was fixed in each of the three defined postures with all component velocities assumed to be zero. This setup simplified the dynamic conditions and allowed for consistent, repeatable vibration measurements. A vibration testing machine (S-0105, Asahi Seisakusho) was mounted on the robot’s end-effector, as shown in Figure 1, generating horizontal vibrations through a sinusoidal input with an acceleration amplitude of 0.098 m/s². Although this amplitude is relatively small, it was sufficient to excite the system without inducing unintended motion. In the experiment, a frequency sweep from 1 Hz to 200 Hz was applied, increasing by 1 Hz per second. To ensure stable vibration data and eliminate the influence of control feedback, the robot was maintained in a servo-off state to isolate its structural vibration characteristics. The defined postures served as reference conditions for consistent vibration data collection and analysis across different operational phases. Despite the inherent resistance and friction of individual robot components, the system successfully captured full-field vibrations, demonstrating the capability of the HFR camera-based measurement to accurately quantify the robot’s dynamic response.
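For reference, the excitation profile described above (a sinusoidal sweep from 1 Hz to 200 Hz increasing by roughly 1 Hz per second, with an acceleration amplitude of 0.098 m/s²) can be reproduced numerically as in the following Python sketch. This is only an aid for simulation and post-processing under the stated assumptions, not the controller of the shaker used in the experiment.

```python
import numpy as np
from scipy.signal import chirp

fs = 1000                  # sampling rate matching the 1000 fps camera
duration = 200             # seconds; sweep rate (200 - 1)/200, i.e. about 1 Hz per second
t = np.arange(0, duration, 1 / fs)

accel_amplitude = 0.098    # m/s^2, as stated in the text
excitation = accel_amplitude * chirp(t, f0=1, t1=duration, f1=200, method="linear")

# Instantaneous excitation frequency at each sample, useful later when reading
# the response amplitude at the current sweep frequency.
inst_freq = 1 + (200 - 1) * t / duration
```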

3.3. Implemented Algorithm

3.3.1. Input Image

When an HFR camera records a video, it is converted into a sequence of images according to the frame rate. Let the camera frame rate be denoted by F (frames per second). The recorded video V can then be represented as a sequence of N input images:
V = \{ I_1(x, y, t_1),\ I_2(x, y, t_2),\ \ldots,\ I_N(x, y, t_N) \}
where I_k(x, y, t_k) denotes the image intensity at pixel coordinates (x, y) in the k-th frame, captured at time t_k. The sampling interval between consecutive frames is constant and given by Δt = 1/F, where F is the camera frame rate.
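As an illustration of this decomposition, the sketch below reads a recorded video into a list of grayscale frames with their time stamps t_k = k·Δt. OpenCV is assumed here purely for convenience; the authors’ system processes the camera stream directly on a GPU, so this is not their implementation.

```python
import cv2  # OpenCV, assumed here only for illustration

def video_to_frames(path):
    """Decompose a recorded HFR video into grayscale frames I_k(x, y, t_k)."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS)   # camera frame rate F
    dt = 1.0 / fps                    # sampling interval between frames
    frames, timestamps = [], []
    k = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        timestamps.append(k * dt)
        k += 1
    cap.release()
    return frames, timestamps
```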

3.3.2. Root Mean Square Error for Ground-Truth Validation

The Root Mean Square Error (RMSE) is commonly used to quantify the difference between a measured signal and a reference (ground truth) signal. The accuracy of the DIC-based displacement measurements was quantified by computing the RMSE between the reference points on the fixed background wall and the robot base_top. The formula for RMSE is given by:
\mathrm{RMSE} = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} \left( W_i - R_i \right)^2 }
where W_i is the wall (reference) value, R_i is the measured value at the robot base_top, and n is the number of observations. This metric provides a quantitative measure of the tracking error, allowing evaluation of the stability and reliability of the DIC measurements. In this study, the wall was treated as the ground-truth signal owing to its structural stability and negligible vibrations, while the robot base was considered the measured signal. The deviation between these two signals is analyzed to assess the robot system’s dynamic response and ensure its operational reliability.
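A minimal implementation of this metric is shown below. Note that the paper reports the value as a percentage; the normalization behind that percentage is not restated here, so the sketch returns the plain RMSE of the two displacement series.

```python
import numpy as np

def rmse(wall, base_top):
    """RMSE between the wall (reference) and robot base_top displacement signals."""
    wall = np.asarray(wall, dtype=float)
    base_top = np.asarray(base_top, dtype=float)
    return np.sqrt(np.mean((wall - base_top) ** 2))
```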

3.3.3. Measurement Points on Robot Components

We mounted the vibration machine on the robot’s end-effector and recorded vibration data for three distinct postures, as shown in Figure 3. By applying mask processing to these sub-images, eight key points representing critical robot components were selected for multi-point DIC displacement measurements, enabling an effective assessment of vibration distribution across the entire manipulator. In addition to the end-effector, where the excitation was applied, the measurement points were located at the robot-top, arm-top, arm-bottom, joint1-top, joint1-bottom, joint2-top, joint2-bottom, and the base-top of the robot. These points were chosen to capture the dynamic behavior of all major structural elements, including links, joints, and the base. During the vibration test, the robot base was securely anchored to the ground to prevent any movement, and the robot was subjected to horizontal excitation. The displacement responses in the X direction at all measurement points closely matched the shaker’s input frequency. In contrast, displacements in the Y direction were negligible and could not be accurately captured. Overall, the robot’s surface and shape remained effectively unchanged before and after the test. The HFR camera recorded the vibration, and the video was converted into individual frames based on the frame rate. For each posture, the first frame was used as the reference image, and all subsequent frames served as input images to capture the displacement of each robot component.
On the robot parts, N measurement points are given in advance as x_i = (x_i, y_i) (i = 0, …, N−1) in the image. By computing the sub-image correlation around x_i between the input image I(x, y, t) at time t and the reference image I_R(x, y), the displacement vector at the i-th measurement point, d_i(t) = (d_xi(t), d_yi(t)), is estimated by the following DIC function:
d_i(t) = \mathrm{DIC}\big( I(x, y, t),\ I_R(x, y);\ x_i \big), \quad i = 0, \ldots, N-1
where the reference image was set to the input image at the start time of the HFR video at each vibration frequency. In the analysis, nine points were designated for DIC displacement measurement, as illustrated in Figure 3. Corresponding to the input excitation (with the shaker mounted on the end-effector), eight output measurement points (robot components) were located: robot-top, arm-top, arm-bottom, joint1-top, joint1-bottom, joint2-top, joint2-bottom, and base-top of the robot.
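The displacement estimation itself relies on sub-image correlation; the authors use a GPU implementation of the phase-only correlation method, while the sketch below illustrates the same idea with the CPU-based phase correlation routine from scikit-image. The 128 × 128 window size and the 0.6 mm/pixel scale are taken from the text; everything else is assumed for illustration, and measurement points must lie at least 64 pixels from the image border.

```python
import numpy as np
from skimage.registration import phase_cross_correlation

MM_PER_PIXEL = 0.6  # spatial resolution reported in Section 3.1

def dic_displacement(input_frame, reference_frame, point, half=64, upsample=100):
    """Estimate the displacement vector d_i(t) at one measurement point.

    A 128 x 128 sub-image centred on `point` is cropped from the reference and
    input frames, and their sub-pixel shift is estimated by phase correlation
    (an illustrative stand-in for the authors' GPU phase-only correlation).
    """
    x, y = point
    ref = reference_frame[y - half:y + half, x - half:x + half]
    cur = input_frame[y - half:y + half, x - half:x + half]
    shift, _, _ = phase_cross_correlation(ref, cur, upsample_factor=upsample)
    dy, dx = shift                                 # scikit-image returns (row, col)
    return dx * MM_PER_PIXEL, dy * MM_PER_PIXEL    # displacement in millimetres
```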

3.3.4. STFT-Based Frequency Response Analysis

A transfer function represents the mathematical relationship between input and output signals, specifically the movement of various robot parts resulting from input-forcing actions. By executing short-time Fourier transforms (STFTs) of the displacement signals over K frames at the measurement points, d_ξi(t − kτ) (k = 0, …, K−1; ξ ∈ {x, y}), their frequency responses were computed as follows:
F_i(j\omega; t) = \mathrm{STFT}\big( d_{\xi i}(t),\ d_{\xi i}(t - \tau),\ \ldots,\ d_{\xi i}(t - (K-1)\tau) \big)
where τ is the frame cycle time of the HFR video, and K determines the frequency resolution of the STFT. In the test, the input in the frequency domain corresponds to the frequency response at the shaker point, F_0(jω; t). By comparing it with the frequency responses at the points on the robot, the following transfer function at the z-th point (z = 1, …, N−1) is computed to identify how the vibration propagates through the robot. The gain G is the output/input ratio of the frequency responses F at the excitation frequency ω.
G_z(j\omega; t) = \frac{F_z(j\omega; t)}{F_0(j\omega; t)}, \quad z = 1, \ldots, N-1
This approach enables the simultaneous estimation of transfer functions at multiple points and the identification of robot parts that exhibit significant vibrations at resonant frequencies through frequency–domain analysis.
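A compact numerical counterpart of the relations above is sketched below using scipy.signal.stft: the STFT of the shaker displacement gives F_0, the STFT of a robot-point displacement gives F_z, and their magnitude ratio is the gain G_z. The 1024-frame window follows Section 4.4; variable names are illustrative, not the authors’ implementation.

```python
import numpy as np
from scipy.signal import stft

fs = 1000   # HFR frame rate, i.e. 1/tau
K = 1024    # STFT window length in frames (frequency resolution of roughly 1 Hz)

def transfer_gain(d_input, d_output):
    """Gain |G_z| = |F_z| / |F_0| between a robot point and the shaker input.

    d_input  : x-displacement time series at the shaker (input) point
    d_output : x-displacement time series at the z-th robot point
    Returns the STFT frequencies, time bins, and the magnitude ratio.
    """
    f, t, F0 = stft(d_input, fs=fs, nperseg=K)
    _, _, Fz = stft(d_output, fs=fs, nperseg=K)
    eps = 1e-12                                  # avoid division by zero off-resonance
    return f, t, np.abs(Fz) / (np.abs(F0) + eps)
```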

4. Experimental Results

4.1. Ground-Truth Measurement

Ground-truth reference points are commonly used in vision-based vibration measurement to validate the accuracy and stability of extracted displacement signals [79,80]. Traditionally, accelerometers serve as ground-truth reference sensors in vibration analyses. However, mounting them at all relevant locations on the robot surface can significantly alter the system’s dynamic behavior. To avoid mass-loading effects and ensure the integrity of the vibration signal, this study employed a fully non-contact ground-truth strategy. An HFR camera simultaneously captured both the robot and the fixed background wall within the same image frame. Rigid points on the background wall, alongside the robot base firmly anchored to the ground, served as reference points to quantify tracking noise. To obtain a reliable ground-truth measurement, the analysis was performed using Posture 1 of the robot, where the system exhibited the most stable behavior, as shown in Figure 4. The vibration signal at the wall (yellow line) showed a waveform similar to that at the robot base_top (green line), and the displacement signal obtained from the wall point closely matched both the waveform and the frequency characteristics of the applied excitation. The RMSE between the wall and the robot base_top was 0.82%. Because DIC determines displacement by tracking pixel-intensity patterns across frames, even minor robot-induced camera motion can generate apparent global shifts in the image. Despite this, the results confirm that the proposed single-sequence, fully non-contact ground-truth strategy yields sufficiently accurate and stable reference data for validating robot vibration responses.

4.2. Real-Time Vibration Visualization

In real operations, industrial robots experience changing dynamic properties such as mass, damping, stiffness, and gravity effects depending on joint configurations. These variations intensify during high-speed motion due to complex dynamics and inertia. Previous studies have mainly used sensor-based techniques for real-time vibration monitoring. Unlike traditional sensors that capture vibrations only at specific points, DIC enables full-field measurement of strain and displacement across the entire surface of the robot [81]. The frequency responses of various robot components were visualized using color maps, enabling detailed spatial analysis of vibration characteristics across the entire structure. DIC calculations and subsequent mask processing were performed to generate color maps, as shown in Figure 5. These maps visualize vibration magnitudes, with red regions indicating higher displacements and blue regions indicating lower displacements. At low frequencies (around 17 Hz), vibrations in Posture 1 tend to span the entire body; in Posture 2, they are concentrated in the middle part; and in Posture 3, they are focused on the upper part. At mid-range frequencies (around 46 Hz), the distribution also varies: in Posture 1, vibrations are spread across the upper part and base, whereas in Postures 2 and 3, they are mainly concentrated in the upper part. At higher frequencies (above 100 Hz), the vibration distribution becomes more uniform. The real-time visualization provided a comprehensive spatial overview of vibration behavior across the robot structure, revealing how vibration energy propagates and localizes under different configurations. This spatial analysis serves as the basis for the time–domain investigations presented in the following section, enabling a quantitative assessment of amplitude variations across multiple components. DIC provides displacement measurements for each captured frame. However, vibration analysis requires signals expressed as functions of time. Although DIC is inherently an image-based technique, the frames are recorded at a constant sampling rate, so each displacement measurement corresponds to a specific time instant. By sequencing these measurements, the displacement field at each tracking point can be converted into a time–domain signal suitable for vibration analysis. These time–domain signals can then be transformed into the frequency domain using techniques such as the STFT to extract vibration characteristics, resonant frequencies, and dynamic response information [24,53,67,82].
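To illustrate how such a color map could be assembled from per-ROI results, the sketch below arranges hypothetical amplitude values on the 29 × 15 virtual-sensor grid, masks out background ROIs, and renders them with a red-to-blue colormap over the image coordinates. The amplitudes and mask here are placeholders, not measured data, and the rendering is only an approximation of the authors’ visualization.

```python
import numpy as np
import matplotlib.pyplot as plt

rows, cols = 15, 29                       # 435 ROI centres on a Full HD frame
amplitudes = np.random.rand(rows, cols)   # placeholder per-ROI amplitudes (mm)
robot_mask = np.ones((rows, cols), bool)  # placeholder mask of ROIs on the robot

masked = np.where(robot_mask, amplitudes, np.nan)   # hide background ROIs
plt.imshow(masked, cmap="jet", extent=[0, 1920, 1080, 0])
plt.colorbar(label="Displacement amplitude (mm)")
plt.title("Full-field vibration map (red = large, blue = small)")
plt.show()
```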

4.3. Time–Domain Analysis of Robot Vibrations

4.3.1. DIC Multi-Point Measurement

Building upon the full-field vibration visualization presented in Section 4.2, this section provides a quantitative time–domain analysis of the displacement signals at key measurement points. In the experiment, an HFR camera was positioned 1.2 m from the robot and equipped with a 20 mm focal length lens. The camera was operated with a 1000 µs exposure time. Validation of the DIC-based multi-point vibration measurements was conducted for three distinct robot postures, with input excitation of 17 Hz, 30 Hz, and 80 Hz applied via a shaker mounted on the robot’s end-effector. Eight measurement points were selected across key robot components: robot-top, arm-top, arm-bottom, joint1-top, joint1-bottom, joint2-top, joint2-bottom, and base-top. The results demonstrate the accuracy and reliability of the multi-point vibration data obtained through DIC, as shown in Figure 6. The HFR camera, operating at 1000 fps, provided a Nyquist frequency of 500 Hz, which is theoretically sufficient to capture vibration responses within the target frequency range. However, signals in the 1–15 Hz range were very small, likely due to the robot’s structural resistance and control constraints, making multi-point vibrations difficult to observe. Similarly, signals in the 125–200 Hz range were weak and overlapped with noise, making reliable measurements challenging. Therefore, the analysis focused on the 15–125 Hz range, where vibration amplitudes were sufficiently large to capture the dominant structural resonances.
Across all postures, the input force signal consistently exhibited higher amplitudes than the output vibration signals measured at various robot components, while the output signals preserved the same waveform characteristics as the input excitation. The power transmission efficiency varied with posture due to changes in joint configuration, leading to posture-dependent displacement differences. Nevertheless, the measured displacements closely followed the input force waveform, validating the reliability of the DIC system.

4.3.2. Time–Domain Analysis

Time–domain analysis is a powerful tool that provides detailed, dynamic insights into a robot’s condition, enabling the rapid detection of sudden changes in behavior. In this study, vibration visualization was derived from displacement data, providing a comprehensive representation of the robot’s dynamic response over time. These time–domain measurements form the basis for subsequent frequency–domain analysis, facilitating the identification of dominant vibration modes and the estimation of transfer functions for each component. Time–domain damage detection algorithms examine vibration responses over time to identify anomalies or structural changes [83], and are often employed to extract physical features indicative of bearing degradation [84]. In this study, time–domain vibration analysis was performed to evaluate the robot’s dynamic response under three representative postures. Displacement signals obtained from DIC were analyzed to capture time-dependent variations in the robot’s structural behavior. This approach provides insight into how vibrational energy propagates and dissipates throughout the structure and serves as a foundation for subsequent frequency–domain and modal analyses. To validate our data accuracy, we present the vibration analysis results in the horizontal directions, corresponding to the excitation direction. The time–domain responses of a healthy robot to frequencies ranging from 15 Hz to 125 Hz are shown in Figure 6.
Posture 1: At 17 Hz, the displacement responses at all points are closely aligned with the vibration frequency of the shaker (input excitation). At lower frequencies, the signal shapes suggest reduced frictional effects. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.305, 0.082, 0.039, and 0.037 mm, respectively. At 30 Hz, the displacement signals of the robot parts became clearer. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.297, 0.015, 0.010, and 0.008 mm. At 80 Hz, the amplitudes became smaller, but the signal shape remained similar to the input force. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.311, 0.016, 0.018, and 0.010 mm, respectively. In the x-direction, the amplitude decreased as the excitation frequency increased, particularly between 25 and 44 Hz.
Posture 2: At 17 Hz, the displacement responses at all points are closely aligned with the vibration frequency of the shaker (input excitation). At lower frequencies, the signal shapes suggest more frictional effects compared to Posture 1 at the same frequency. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.044, 0.020, 0.013, and 0.009 mm, respectively. At 30 Hz, the displacement signals of the robot parts became clearer, with less friction than at 17 Hz. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.056, 0.010, 0.008, and 0.005 mm. At 80 Hz, the amplitudes became very small, and the signal exhibited more frictional effects. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.066, 0.018, 0.007, and 0.004 mm, respectively.
Posture 3: At 17 Hz, the displacement responses at all points are closely aligned with the vibration frequency of the shaker (input excitation). At lower frequencies, the signal shapes suggest reduced frictional effects. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.055, 0.038, 0.017, and 0.012 mm, respectively. At 30 Hz, the amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.023, 0.010, 0.010, and 0.005 mm. At 80 Hz, the amplitudes became very small, and the signal exhibited more frictional effects. The amplitudes at the shaker, arm-top, joint 1-top, and joint 2-top were 0.038, 0.018, 0.021, and 0.020 mm, respectively.
At 17 Hz, the displacement responses in all three postures were synchronized with the shaker’s input frequency, confirming vibration transmission through the robot’s structure. However, the amplitudes differed notably across postures. Posture 1 exhibited the highest amplitudes (0.305 mm at the shaker, 0.082 mm at the arm-top, 0.039 mm at joint 1-top, and 0.037 mm at joint 2-top). In contrast, Posture 2 showed a substantial reduction (0.044, 0.020, 0.013, and 0.009 mm, respectively), while Posture 3 demonstrated intermediate values (0.055, 0.038, 0.017, and 0.012 mm). Consequently, in Posture 1, the vibration signals from the robot components closely resembled the input force signal in both shape and frequency content, with relatively less noise. This suggests that the structural response under Posture 1 preserved the characteristics of the input excitation more effectively than the other postures. In Posture 2, at low frequencies such as 17 Hz, the vibration signals also resembled the input force in shape and frequency content but contained more noise. At high frequencies like 80 Hz, the signals showed less noise. In Posture 3, at low frequencies such as 17 Hz, the vibration signals closely resembled the input signal and showed relatively less noise. However, at high frequencies such as 80 Hz, more noise was observed.

4.4. Frequency–Domain Spectrum Analysis

Frequency–domain analysis evaluates a system’s dynamic behavior by examining its frequency response. By linking the visualized time–domain motion to frequency–domain characteristics, this analysis provides a comprehensive understanding of the robot’s dynamic behavior and vibration propagation. Transfer functions at the measurement points were obtained by performing STFT computations on 1024 frames for the horizontal displacements estimated at all points. The power spectrograms of the output/input ratios in the x direction at the joint1-top are shown in Figure 7, Figure 8 and Figure 9. These points were analyzed in a 200-s full-HD video captured at 1000 fps, with measurements taken at 1-s intervals.
Posture 1: In the Posture 1 x direction spectrograms, as shown in Figure 7, large responses were observed around t = 15–25 s, t = 45–50 s, t = 60–65 s, t = 72–110 s, and t = 125–145 s when the excited vibration frequencies were 15–25 Hz, 45–50 Hz, 60–65 Hz, 72–110 Hz, and 125–145 Hz, respectively.
Posture 2: In the x direction spectrograms, as shown in Figure 8, large responses were observed around t = 15–25 s, t = 45–52 s, and t = 120–145 s when the excited vibration frequencies were 15–25 Hz, 45–52 Hz, and 120–145 Hz, respectively.
Posture 3: In the x direction spectrograms, as shown in Figure 9, large responses were observed around t = 15–27 s, t = 57–62 s, and t = 65–100 s when the excited vibration frequencies were 15–27 Hz, 57–62 Hz, and 65–100 Hz, respectively.
The results from the three postures are shown in Figure 10, Figure 11 and Figure 12. When sweeping excitation frequencies from 1 Hz to 200 Hz, we observed that signal detection in the low-frequency range (1–15 Hz) was difficult. This was primarily due to the structural stiffness and internal resistance of the robot, which dampened minor displacements, resulting in signals too weak to be reliably captured by the DIC system. At the upper end of the frequency range (125–200 Hz), accurate vibration measurement became increasingly challenging due to mechanical backlash and internal joint compliance. These factors introduced nonlinearities and phase inconsistencies, which distorted the displacement signals and led to irregularities in the frequency response function.

4.5. Definition and Identification of Transfer Functions

4.5.1. Definition of Transfer Functions

The transfer function defines the dynamic relationship between input and output signals in the frequency domain, representing the system’s inherent characteristics. In this study, transfer functions were estimated from DIC-based displacement measurements to characterize the robot’s dynamic behavior under controlled excitation. Specifically, the transfer function, defined as the ratio of output displacement of the robot components to the input excitation from the shaker, was computed for all components, enabling full-field vibration analysis during operation. The resulting frequency response, derived from these transfer functions, describes how the output amplitude varies as a function of the applied excitation frequency. A sine-sweep input ranging from 1 to 200 Hz, increasing at 1 Hz/s, was applied to the robot manipulator, and vibration responses were recorded at nine measurement points. At each frequency step, only the displacement amplitude corresponding to the instantaneous excitation frequency was used to minimize frequency-transition effects. Consequently, the obtained frequency responses reflect how individual robot components react to the applied excitation across the frequency range, providing insight into their dynamic characteristics, as shown in Figure 10, Figure 11 and Figure 12, where each component’s response is compared relative to the shaker input.
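The procedure described above, i.e. reading off only the amplitude at the instantaneous excitation frequency of the roughly 1 Hz/s sweep, could be sketched as follows. The displacement series here are random placeholders; with measured data, response_freqs and gain would trace frequency response curves of the kind plotted in Figure 10, Figure 11 and Figure 12.

```python
import numpy as np
from scipy.signal import stft

fs, K = 1000, 1024
# Placeholder x-displacement series recorded during the 1-200 Hz sweep.
d_shaker = np.random.randn(200 * fs)
d_point = np.random.randn(200 * fs)

f, t, F0 = stft(d_shaker, fs=fs, nperseg=K)
_, _, Fz = stft(d_point, fs=fs, nperseg=K)

# Instantaneous excitation frequency at each STFT time bin (sweep of about 1 Hz per s).
inst_freq = 1.0 + t
valid = inst_freq <= 200.0
cols = np.flatnonzero(valid)
rows = np.array([np.argmin(np.abs(f - fq)) for fq in inst_freq[valid]])

# Frequency response: the output/input gain sampled along the sweep line only.
gain = np.abs(Fz[rows, cols]) / (np.abs(F0[rows, cols]) + 1e-12)
response_freqs = inst_freq[valid]
```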

4.5.2. Transfer Function Identification for Posture 1

The transfer function exhibits prominent responses at 19, 21, 47, 63, and 98 Hz, as shown in Figure 10. Notably, significant peaks are observed at 19 Hz and 21 Hz. This suggests that resonance begins around 19 Hz, with a peak at 21 Hz, indicating the presence of a natural frequency in this range. The amplified response at 19, 21, 47, and 63 Hz implies that the robot is particularly sensitive to excitations near these frequencies, which could result in increased vibrations and potential instability if not properly managed.

4.5.3. Transfer Function Identification for Posture 2

The transfer function shows significant peaks at 15 Hz, 43 Hz, and 118 Hz, as shown in Figure 11. These frequencies correspond to the system’s dynamic response under external excitation and indicate potential resonance points. The peak at 15 Hz may represent a low-frequency mode associated with the robot’s overall structure or base movement. In contrast, the peaks at 43 Hz and 118 Hz likely correspond to higher-order vibration modes, possibly due to joint flexibility or local structural resonances.
These resonant frequencies suggest that the robot may experience amplified vibrations when excited near these frequencies, potentially affecting positioning accuracy and long-term mechanical integrity if not properly mitigated.

4.5.4. Transfer Function Identification for Posture 3

Transfer function peaks observed at 19 Hz, 21 Hz, 59 Hz, and 69 Hz are shown in Figure 12. The responses at 19 Hz and 21 Hz again indicate the onset and peak of a resonant mode, suggesting a natural frequency in this range. The additional peaks at 59 Hz and 69 Hz highlight another resonant region, likely associated with higher-order structural or joint dynamics. Based on the results from Postures 1 and 2, the transfer function was observed despite the robot’s resistance. However, in Posture 3, increased friction was present compared to Postures 1 and 2. Therefore, for long-term robot operation, Postures 1 and 2 are preferable due to their lower friction and superior structural stiffness. This distribution is essential for capturing a comprehensive understanding of how different sections of the robot respond to a common excitation source.
Figure 10. Transfer Function Identification of Posture 1.
Figure 11. Transfer Function Identification of Posture 2.
Figure 12. Transfer Function Identification of Posture 3.

4.5.5. Part-by-Part Analysis of the Transfer Function

Analyzing vibration data from various points on the robot enables detection of localized dynamic behavior differences among its components. As discussed above, these variations arise due to differences in structural stiffness, mass distribution, and mechanical coupling at each location. Consequently, each part of the robot exhibits a distinct vibration response, as further visualized in Figure 13 and Figure 14, underscoring the importance of multi-point measurement for accurately characterizing the robot’s overall dynamic behavior. Resonant frequencies are sensitive indicators of a robot’s dynamic state, and changes in these frequencies can signal structural issues such as loose joints, fatigue, or wear. Operating near resonant frequencies amplifies dynamic responses, increasing the risk of damage and component failure. In particular, resonant frequencies can intensify the effects of unbalanced forces, leading to excessive vibrations [85,86]. To minimize joint wear, the optimal robot placement is determined by evaluating all possible positions within the workspace [87]. Therefore, accurately identifying and avoiding resonant frequencies is essential for ensuring long-term reliability and structural integrity [86,88], and extending the equipment’s operational lifetime. These findings highlight the importance of posture-dependent dynamics and structural interactions in robotic vibration analysis.

5. Discussion and Limitations

5.1. Discussion of Results

In smart factory environments, monitoring the condition of industrial robots is essential for the early detection of abnormal vibrations to prevent accidents, ensure worker safety, and avoid costly breakdowns. To address this need, this study identifies the transfer function of the robot structure under different configurations. The results demonstrate that the resonant frequencies vary across postures, indicating that the robot’s dynamic response is strongly posture dependent. In Posture 1, resonances were observed at 19, 21, 47, and 63 Hz. In Posture 2, resonances occurred at 15, 43, and 118 Hz, while in Posture 3, they appeared at 19, 21, 59, and 69 Hz. Based on these measurements, the first natural frequency of the robot structure was 19 Hz in Posture 1, 15 Hz in Posture 2, and 19 Hz in Posture 3.
Using accelerometer sensors, Jiabin et al. [89] analyzed vibrations of the ABB IRB6400 robot and identified dominant peaks at approximately 16 Hz (P2) and 15 Hz (P3), corresponding to their predicted natural frequencies. Similarly, Ryuta et al. [28] investigated another six-axis robot and reported a resonance frequency of around 15 Hz, associated with translational and rotational vibrations between the J1 and J2 links. The differences in frequencies are likely due to differences in boundary conditions, robot configurations, and joint stiffness, which are known to influence the dynamic characteristics of multi-link manipulators. Part-by-part analysis of the transfer function revealed that each robot component exhibits a distinct vibration response, underscoring the importance of multi-point measurement for accurately characterizing the overall dynamic performance of the robotic system. Operating near resonant frequencies amplifies vibrations, accelerating fatigue, wear, and the risk of component failure. Accurate identification and avoidance of these frequencies are therefore essential to ensure long-term reliability and extend equipment lifespan. Moreover, by continuously monitoring changes in resonant frequencies over time, the proposed system enables structural health monitoring and early detection of mechanical degradation. Integrating this vibration-based monitoring framework into predictive maintenance strategies facilitates continuous assessment of robot health, promoting proactive fault prevention and enhancing operational efficiency in smart manufacturing environments.

5.2. Limitations of the Present Study

This study has several limitations that should be acknowledged. First, the research provides a foundational validation of the proposed vibration monitoring approach, representing an initial step toward future smart factory applications. Second, all vibration measurements were conducted under stationary conditions with the robot in a servo-off state to eliminate control feedback effects; as a result, the data may not fully reflect the dynamic conditions of actual operation. Additionally, only three representative postures were analyzed, limiting the ability to capture the full range of dynamic behaviors across the robot’s workspace. The findings are based on a single measurement sequence for each posture. While ground-truth validation confirmed the accuracy of the measurement chain, the lack of statistical repetition limits the generalizability of the absolute amplitude values. The identified resonant frequencies, however, are considered reliable, as they reflect fundamental structural properties. The method is also sensitive to optical factors, including illumination uniformity, focus stability, and speckle pattern quality. Variations in lighting, focus deviations, or surface contamination can reduce image contrast, impair tracking reliability, and degrade DIC accuracy. Finally, data collection in this study was limited to three postures during the robot’s waiting periods and was analyzed only in the X and Y directions due to the use of a single HFR camera, which constrained Z-axis analysis.

6. Conclusions and Future Work

This study demonstrates that an HFR video camera combined with DIC enables non-contact, multi-point, and full-field vibration measurement of industrial robot manipulators. By analyzing three representative postures defined by joint angles, posture-dependent vibration transmission was quantitatively characterized, revealing distinct dynamic behaviors across configurations. A real-time vibration visualization system was developed to validate the feasibility and practical applicability of the proposed method. The vibration visualization, derived from displacement data, provided a comprehensive representation of the robot’s dynamic response over time. These time–domain measurements were subsequently transformed into the frequency domain to identify dominant vibration modes and estimate transfer functions for each component. The results confirm that robot posture strongly influences vibration transmission characteristics, underscoring the importance of posture-aware vibration evaluation for reliable and precise robot operation. Overall, the findings highlight that integrating multi-point vibration measurements with transfer function analysis offers a comprehensive understanding of the dynamic behavior of industrial robots, supporting their stable, safe, and efficient operation in smart manufacturing environments. Furthermore, this study presents a novel non-contact, full-field vibration monitoring approach capable of simultaneously capturing multi-point vibrations and posture-dependent dynamic responses. Such comprehensive characterization of robot dynamics is difficult to achieve with conventional sensor-based methods, which are often limited by the number of measurement points, fixed sensor positions, and complex installation requirements.
Future work will focus on enhancing the robustness and practical applicability of the proposed approach. Optical limitations such as illumination non-uniformity, defocusing, and speckle-pattern degradation will be investigated further to improve measurement reliability under real-world conditions. The method will also be extended to real-time vibration monitoring during active robot motion, enabling more accurate characterization of operational dynamics. To capture a broader range of posture-dependent behaviors, future experiments will incorporate additional robot configurations and employ multi-camera stereo imaging for full 3D vibration analysis, long-term monitoring, endurance testing, and detailed evaluation of vibration characteristics across various robot states. Experiments will be repeated multiple times to compute mean values, standard deviations, and standard errors, ensuring robust statistical evaluation of trends and minimizing the influence of random fluctuations. In addition, comparative analyses with experimental modal analysis (EMA) and finite element method (FEM) simulations will be performed to validate the vibration modes identified through DIC-based measurements. Simultaneous measurements using DIC and accelerometers will also be conducted to compare transfer functions and verify the consistency between optical and sensor-based methods. Building on this framework, future research aims to develop a dynamic digital twin of industrial robots by mapping real-time experimental data onto physics-based models. Furthermore, multi-sensor fusion, integrating DIC-based displacement measurements with data from metrologically certified accelerometers placed at key locations, is a promising approach for enhancing measurement accuracy and providing the traceability required for industrial adoption and certification. By leveraging the high sampling rate of contact accelerometers, such a data-fusion strategy can combine acceleration measurements with non-contact displacement estimates from high-speed video, enabling accurate identification of critical dynamic deformation states in robotic structures [90]. This integration not only improves the reliability and precision of vibration monitoring but also enhances the method's suitability for industrial applications. Addressing these challenges will advance predictive maintenance, scalable monitoring, and the overall safety, efficiency, and reliability of smart manufacturing environments.
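One possible realization of such a fusion strategy is a complementary filter that trusts the camera-derived displacement at low frequencies and the twice-integrated accelerometer signal at high frequencies. The Python sketch below is a simplified illustration under these assumptions (illustrative crossover frequency, no anti-alias filtering before resampling); it is not the method of [90] nor a procedure used in this study.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.signal import butter, filtfilt

def fuse_displacement(disp_cam, accel, fs_cam=1000, fs_acc=10000, fc=50.0):
    """Complementary-filter fusion of camera displacement (accurate at low
    frequencies) and accelerometer data (rich at high frequencies).
    The crossover frequency fc is illustrative."""
    # Integrate acceleration twice to displacement at the accelerometer rate.
    vel = cumulative_trapezoid(accel, dx=1.0 / fs_acc, initial=0.0)
    disp_acc = cumulative_trapezoid(vel, dx=1.0 / fs_acc, initial=0.0)
    # Resample the accelerometer-derived displacement onto the camera timeline
    # (a real system would apply anti-alias filtering before this step).
    t_cam = np.arange(disp_cam.size) / fs_cam
    t_acc = np.arange(disp_acc.size) / fs_acc
    disp_acc_cam = np.interp(t_cam, t_acc, disp_acc)
    # Low-pass the camera signal, high-pass the accelerometer signal, and sum.
    b_lo, a_lo = butter(2, fc, btype="lowpass", fs=fs_cam)
    b_hi, a_hi = butter(2, fc, btype="highpass", fs=fs_cam)
    return filtfilt(b_lo, a_lo, disp_cam) + filtfilt(b_hi, a_hi, disp_acc_cam)
```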

Author Contributions

Conceptualization, I.I.; methodology, T.A. and K.S.; software, T.A. and F.W.; validation, T.A., F.W., K.S. and I.I.; formal analysis, T.A., F.W., K.S. and I.I.; investigation, T.A., H.O., K.S. and I.I.; resources, H.O., K.S. and I.I.; data curation, H.O., T.A. and K.S.; writing—original draft preparation, T.A.; writing—review and editing, T.A., F.W., K.S. and I.I.; visualization, T.A. and K.S.; supervision, I.I.; project administration, I.I. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by JST SPRING, Grant Number JPMJSP2132.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Tran, C.C.; Lin, C.Y. An Intelligent Path Planning of Welding Robot Based on Multisensor Interaction. IEEE Sens. J. 2023, 23, 8591–8604. [Google Scholar] [CrossRef]
  2. Yan, B.; Fan, P.; Lei, X.; Liu, Z.; Yang, F. A Real-Time Apple Targets Detection Method for Picking Robot Based on Improved YOLOv5. Remote Sens. 2021, 13, 1619. [Google Scholar] [CrossRef]
  3. Liao, Z.Y.; Wang, Q.H.; Xie, H.L.; Li, J.R.; Zhou, X.F.; Pan, T.H. Optimization of Robot Posture and Workpiece Setup in Robotic Milling with Stiffness Threshold. IEEE/ASME Trans. Mechatron. 2022, 27, 582–593. [Google Scholar] [CrossRef]
  4. Min, Y.; Liu, X.; Hu, G.; Jin, G.; Ma, Y.; Bian, Y.; Xie, Y.; Hu, M.; Li, D. A Research Method to Investigate the Effect of Vibration Suppression on Thin-Walled Parts of Aluminum Alloy 6061 Based on Cutting Fluid Spraying (CFS). Machines 2025, 13, 594. [Google Scholar] [CrossRef]
  5. Xia, B.; Wang, K.; Xu, A.; Zeng, P.; Yang, N.; Li, B. Intelligent Fault Diagnosis for Bearings of Industrial Robot Joints Under Varying Working Conditions Based on Deep Adversarial Domain Adaptation. IEEE Trans. Instrum. Meas. 2022, 71, 3508313. [Google Scholar] [CrossRef]
  6. Cooley, C.G.; Parker, R.G. A Review of Planetary and Epicyclic Gear Dynamics and Vibrations Research. Appl. Mech. Rev. 2014, 66, 040804. [Google Scholar] [CrossRef]
  7. Sahu, G.N.; Otto, A.; Ihlenfeldt, S. Improving Robotic Milling Performance through Active Damping of Low-Frequency Structural Modes. J. Manuf. Mater. Process. 2024, 8, 160. [Google Scholar] [CrossRef]
  8. Qin, H.; Li, Y.; Xiong, X. Workpiece Pose Optimization for Milling with Flexible-Joint Robots to Improve Quasi-Static Performance. Appl. Sci. 2019, 9, 1044. [Google Scholar] [CrossRef]
  9. Li, M.; Qin, J.; Wang, Z.; Liu, Q.; Shi, Y.; Wang, Y. Optimal Motion Planning Under Dynamic Risk Region for Safe Human–Robot Cooperation. IEEE/ASME Trans. Mechatron. 2024, 30, 1050–1060. [Google Scholar] [CrossRef]
  10. Inam, R.; Raizer, K.; Hata, A.; Souza, R.; Forsman, E.; Cao, E.; Wang, S. Risk Assessment for Human-Robot Collaboration in an automated warehouse scenario. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, Italy, 4–7 September 2018; Volume 1, pp. 743–751. [Google Scholar] [CrossRef]
  11. Franklin, C.S.; Dominguez, E.G.; Fryman, J.D.; Lewandowski, M.L. Collaborative robotics: New era of human-robot cooperation in the workplace. J. Saf. Res. 2020, 74, 153–160. [Google Scholar] [CrossRef]
  12. Zhang, Y.; Wu, J.; Gao, B.; Xia, L.; Lu, C.; Wang, H.; Cao, G. Fault Types and Diagnostic Methods of Manipulator Robots: A Review. Sensors 2025, 25, 1716. [Google Scholar] [CrossRef]
  13. Ahmad, R.; Kamaruddin, S. An overview of time-based and condition-based maintenance in industrial application. Comput. Ind. Eng. 2012, 63, 135–149. [Google Scholar] [CrossRef]
  14. Nguyen, T.D.; Nguyen, T.H.; Do, D.T.B.; Pham, T.H.; Liang, J.W.; Nguyen, P.D. Efficient and Explainable Bearing Condition Monitoring with Decision Tree-Based Feature Learning. Machines 2025, 13, 467. [Google Scholar] [CrossRef]
  15. Hosseinzadeh, A.; Frank Chen, F.; Shahin, M.; Bouzary, H. A predictive maintenance approach in manufacturing systems via AI-based early failure detection. Manuf. Lett. 2023, 35, 1179–1186. [Google Scholar] [CrossRef]
  16. Raviola, A.; De Martin, A.; Guida, R.; Jacazio, G.; Mauro, S.; Sorli, M. Harmonic Drive Gear Failures in Industrial Robots Applications: An Overview. PHM Soc. Eur. Conf. 2021, 6, 11. [Google Scholar] [CrossRef]
  17. Vallachira, S.; Orkisz, M.; Norrlöf, M.; Butail, S. Data-Driven Gearbox Failure Detection in Industrial Robots. IEEE Trans. Ind. Inform. 2020, 16, 193–201. [Google Scholar] [CrossRef]
  18. Mohammed, A.; Ghaithan, A.; Al-Saleh, M.; Al-Ofi, K. Reliability-Based Preventive Maintenance Strategy of Truck Unloading Systems. Appl. Sci. 2020, 10, 6957. [Google Scholar] [CrossRef]
  19. Huang, X.; Zhang, J.; Cheng, M. Fault Detection of Servo Motor Bearing Based on Speed Signal Under Variable-Speed Conditions. IEEE Trans. Instrum. Meas. 2024, 73, 3517012. [Google Scholar] [CrossRef]
  20. Aggogeri, F.; Borboni, A.; Merlo, A.; Pellegrini, N.; Ricatto, R. Real-Time Performance of Mechatronic PZT Module Using Active Vibration Feedback Control. Sensors 2016, 16, 1577. [Google Scholar] [CrossRef]
  21. Li, J.; Liu, X.; Liu, F.; Xu, D.; Gu, Q.; Ishii, I. A Hardware-Oriented Algorithm for Ultra-High-Speed Object Detection. IEEE Sens. J. 2019, 19, 3818–3831. [Google Scholar] [CrossRef]
  22. Sharma, A.; Shimasaki, K.; Gu, Q.; Chen, J.; Aoyama, T.; Takaki, T.; Ishii, I.; Tamura, K.; Tajima, K. Super high-speed vision platform for processing 1024 × 1024 images in real time at 12500 fps. In Proceedings of the 2016 IEEE/SICE International Symposium on System Integration (SII), Sapporo, Japan, 13–15 December 2016; pp. 544–549. [Google Scholar] [CrossRef]
  23. Li, J.; Shimasaki, K.; Tuniyazi, A.; Ishii, I.; Ogihara, M.; Yoshiyama, M. HFR Video-Based Hornet Detection Approach Using Wing-Beat Frequency Analysis. In Proceedings of the 2023 IEEE SENSORS, Vienna, Austria, 29 October–1 November 2023; pp. 1–4. [Google Scholar] [CrossRef]
  24. Wang, F.; Hu, S.; Shimasaki, K.; Ishii, I. Real-Time Vibration Visualization Using GPU-Based High-Speed Vision. J. Robot. Mechatron. 2022, 34, 1011–1023. [Google Scholar] [CrossRef]
  25. Macedo, L.; Louzada, P.; Gustavo Villani, L.; Frizera, A.; Marques, C.; Leal-Junior, A. Machinery Fault Diagnosis in Electric Motors Through Mechanical Vibration Monitoring Using Fiber Bragg Grating-Based Accelerometers. IEEE Sens. J. 2024, 24, 20655–20662. [Google Scholar] [CrossRef]
  26. Wang, T.; Han, Q.; Chu, F.; Feng, Z. Vibration based condition monitoring and fault diagnosis of wind turbine planetary gearbox: A review. Mech. Syst. Signal Process. 2019, 126, 662–685. [Google Scholar] [CrossRef]
  27. Wang, H.; Xie, J. Fault Diagnosis of Rolling Bearings Based on Acoustic Signals in Strong Noise Environments. Appl. Sci. 2025, 15, 1389. [Google Scholar] [CrossRef]
  28. Sato, R.; Ito, Y.; Mizuura, S.; Shirase, K. Vibration Mode and Motion Trajectory Simulations of an Articulated Robot by a Dynamic Model Considering Joint Bearing Stiffness. Int. J. Autom. Technol. 2021, 15, 631–640. [Google Scholar] [CrossRef]
  29. DiGiampaolo, E.; DiCarlofelice, A.; Gregori, A. An RFID-Enabled Wireless Strain Gauge Sensor for Static and Dynamic Structural Monitoring. IEEE Sens. J. 2017, 17, 286–294. [Google Scholar] [CrossRef]
  30. Zhu, R.; Jiang, D.; Huang, Z.; Xie, L.; Zhang, D.; Fei, Q. Full-field modal identification using reliability-guided frequency-domain-based digital image correlation method based on multi-camera system. Measurement 2023, 211, 112567. [Google Scholar] [CrossRef]
  31. Elvira-Ortiz, D.A.; Romero-Troncoso, R.d.J.; Jaen-Cuellar, A.Y.; Morales-Velazquez, L.; Osornio-Rios, R.A. Vibration Suppression for Improving the Estimation of Kinematic Parameters on Industrial Robots. Shock Vib. 2016, 2016, 6954012. [Google Scholar] [CrossRef]
  32. Sun, X.; Jia, X. A Fault Diagnosis Method of Industrial Robot Rolling Bearing Based on Data Driven and Random Intuitive Fuzzy Decision. IEEE Access 2019, 7, 148764–148770. [Google Scholar] [CrossRef]
  33. Huang, X.; Huang, H.; Wu, Z. Development of a Variable-Frequency Hammering Method Using Acoustic Features for Damage-Type Identification. Appl. Sci. 2023, 13, 1329. [Google Scholar] [CrossRef]
  34. Zhao, C.; Tang, B.; Huang, Y.; Deng, L. Edge Collaborative Compressed Sensing in Wireless Sensor Networks for Mechanical Vibration Monitoring. IEEE Trans. Ind. Inform. 2023, 19, 8852–8864. [Google Scholar] [CrossRef]
  35. Rothberg, S.; Allen, M.; Castellini, P.; Di Maio, D.; Dirckx, J.; Ewins, D.; Halkon, B.; Muyshondt, P.; Paone, N.; Ryan, T.; et al. An international review of laser Doppler vibrometry: Making light work of vibration measurement. Opt. Lasers Eng. 2017, 99, 11–22. [Google Scholar] [CrossRef]
  36. Li, Y.; Dieussaert, E.; Baets, R. Miniaturization of Laser Doppler Vibrometers—A Review. Sensors 2022, 22, 4735. [Google Scholar] [CrossRef]
  37. Gao, C.; Wang, Q.; Wei, G.; Long, X. A Highly Accurate Calibration Method for Terrestrial Laser Doppler Velocimeter. IEEE Trans. Instrum. Meas. 2017, 66, 1994–2003. [Google Scholar] [CrossRef]
  38. Dao, F.; Zeng, Y.; Zou, Y.; Li, X.; Qian, J. Acoustic Vibration Approach for Detecting Faults in Hydroelectric Units: A Review. Energies 2021, 14, 7840. [Google Scholar] [CrossRef]
  39. Wang, Q.; Zhang, Y.; Cheng, S.; Wang, X.; Wu, S.; Liu, X. MEMS Acoustic Sensors: Charting the Path from Research to Real-World Applications. Micromachines 2025, 16, 43. [Google Scholar] [CrossRef]
  40. Nguyen, T.V.; Okamoto, Y.; Takeshita, T.; Takei, Y.; Okada, H.; Nguyen, K.; Phan, H.P.; Ichiki, M. Highly Sensitive Low-Frequency Acoustic Sensor Using Piezoresistive Cantilever. In Proceedings of the 2022 IEEE 35th International Conference on Micro Electro Mechanical Systems Conference (MEMS), Tokyo, Japan, 9–13 January 2022; pp. 841–844. [Google Scholar] [CrossRef]
  41. Yeo, W.; Matsumoto, M. Fault Diagnosis Systems for Robots: Acoustic Sensing-Based Identification of Detached Components for Fault Localization. Appl. Sci. 2025, 15, 6564. [Google Scholar] [CrossRef]
  42. Tanuska, P.; Spendla, L.; Kebisek, M.; Duris, R.; Stremy, M. Smart Anomaly Detection and Prediction for Assembly Process Maintenance in Compliance with Industry 4.0. Sensors 2021, 21, 2376. [Google Scholar] [CrossRef] [PubMed]
  43. Sabato, A.; Niezrecki, C.; Fortino, G. Wireless MEMS-Based Accelerometer Sensor Boards for Structural Vibration Monitoring: A Review. IEEE Sens. J. 2016, 17, 226–235. [Google Scholar] [CrossRef]
  44. Koene, I.; Klar, V.; Viitala, R. IoT connected device for vibration analysis and measurement. HardwareX 2020, 7, e00109. [Google Scholar] [CrossRef] [PubMed]
  45. Hasani, H.; Freddi, F.; Piazza, R.; Ceruffi, F. A Wireless Data Acquisition System Based on MEMS Accelerometers for Operational Modal Analysis of Bridges. Sensors 2024, 24, 2121. [Google Scholar] [CrossRef]
  46. Albarbar, A.; Mekid, S.; Starr, A.; Pietruszkiewicz, R. Suitability of MEMS Accelerometers for Condition Monitoring: An experimental study. Sensors 2008, 8, 784. [Google Scholar] [CrossRef]
  47. Ozevin, D. MEMS Acoustic Emission Sensors. Appl. Sci. 2020, 10, 8966. [Google Scholar] [CrossRef]
  48. Ali, W.R.; Prasad, M. Piezoelectric MEMS based acoustic sensors: A review. Sens. Actuators A Phys. 2020, 301, 111756. [Google Scholar] [CrossRef]
  49. Gemelli, A.; Tambussi, M.; Fusetto, S.; Aprile, A.; Moisello, E.; Bonizzoni, E.; Malcovati, P. Recent Trends in Structures and Interfaces of MEMS Transducers for Audio Applications: A Review. Micromachines 2023, 14, 847. [Google Scholar] [CrossRef] [PubMed]
  50. Ivanov, A. Simple in-system control of microphone sensitivities in an array. J. Sens. Sens. Syst. 2024, 13, 81–88. [Google Scholar] [CrossRef]
  51. Wilmott, D.; Alves, F.; Karunasiri, G. Bio-Inspired Miniature Direction Finding Acoustic Sensor. Sci. Rep. 2016, 6, 29957. [Google Scholar] [CrossRef]
  52. Ivancic, J.; Alves, F. Directional Multi-Resonant Micro-Electromechanical System Acoustic Sensor for Low Frequency Detection. Sensors 2024, 24, 2908. [Google Scholar] [CrossRef]
  53. Frankovský, P.; Delyová, I.; Sivák, P.; Bocko, J.; Živčák, J.; Kicko, M. Modal Analysis Using Digital Image Correlation Technique. Materials 2022, 15, 5658. [Google Scholar] [CrossRef]
  54. Shimasaki, K.; Aliansyah, Z.; Senoo, T.; Ishii, I.; Ito, T. Wide-area Operation Monitoring of Conveyors Using a Panoramic Vibration Camera. ISIJ Int. 2021, 61, 2587–2596. [Google Scholar] [CrossRef]
  55. Jiang, M.; Gu, Q.; Aoyama, T.; Takaki, T.; Ishii, I. Real-Time Vibration Source Tracking Using High-Speed Vision. IEEE Sens. J. 2017, 17, 1513–1527. [Google Scholar] [CrossRef]
  56. Kalach, F.E.; Farahani, M.; Wuest, T.; Harik, R. Real-time defect detection and classification in robotic assembly lines: A machine learning framework. Robot. Comput.-Integr. Manuf. 2025, 95, 103011. [Google Scholar] [CrossRef]
  57. O’Donoughue, P.; Gautier, F.; Meteyer, E.; Durand-Texte, T.; Secail-Geraud, M.; Foucart, F.; Robin, O.; Berry, A.; Melon, M.; Pezerat, C.; et al. Comparison of three full-field optical measurement techniques applied to vibration analysis. Sci. Rep. 2023, 13, 3261. [Google Scholar] [CrossRef]
  58. Arebi, L.; Gu, F.; Ball, A. A comparative study of misalignment detection using a novel Wireless Sensor with conventional Wired Sensors. J. Phys. Conf. Ser. 2012, 364, 012049. [Google Scholar] [CrossRef]
  59. Zhong, J.; Zhong, S.; Zhang, Q.; Liu, S.; Peng, Z.; Maia, N. Real-time three-dimensional vibration monitoring of rotating shafts using constant-density sinusoidal fringe pattern as tri-axial sensor. Mech. Syst. Signal Process. 2019, 115, 132–146. [Google Scholar] [CrossRef]
  60. Wei, Y.; Huang, Y.; Wu, H.; Wang, P.; Chen, B.; Gao, Z.; Fu, Y. Vibration monitoring of rotating shafts using DIC and compressed sensing. Opt. Laser Technol. 2025, 182, 112189. [Google Scholar] [CrossRef]
  61. Beberniss, T.; Ehrhardt, D. High-speed 3D digital image correlation vibration measurement: Recent advancements and noted limitations. Mech. Syst. Signal Process. 2016, 86, 35–48. [Google Scholar] [CrossRef]
  62. Zettel, S.; Norambuena, M.; Dewald, R.; Böswald, M. Comparison of laser doppler vibrometry and digital image correlation measurement techniques for applications in vibroacoustics. In Proceedings of the International Congress on Sound and Vibration (ICSV), Prague, Czech Republic, 9–13 July 2023. [Google Scholar]
  63. Qin, W.; Wang, F.; Hu, S.; Shimasaki, K.; Ishii, I. Model-Based 3-D Vibration Measurement Using Stereo High-Frame-Rate Images. IEEE Trans. Instrum. Meas. 2025, 74, 5040812. [Google Scholar] [CrossRef]
  64. Zhi-gang, C.; Wen-jie, L.; Xing-yu, C.; Xin-zheng, L. A vibration recognition method based on deep learning and signal processing. Eng. Mech. 2021, 38, 230–246. [Google Scholar] [CrossRef]
  65. Shimasaki, K.; Jiang, M.; Takaki, T.; Ishii, I. Real-time Multicopter Detection Using Pixel-level Digital Filters for Frame-Interpolated High-frame-rate Images. In Proceedings of the 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand, 9–12 July 2018; pp. 304–309. [Google Scholar] [CrossRef]
  66. Jiang, M.; Aoyama, T.; Takaki, T.; Ishii, I. Vibration source localization for motion-blurred high-frame-rate videos. In Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, China, 3–7 December 2016; pp. 774–779. [Google Scholar] [CrossRef]
  67. Abudoureheman, T.; Wang, F.; Shimasaki, K.; Ishii, I. HFR-Video-Based Vibration Analysis of a Multi-Jointed Robot Manipulator. J. Robot. Mechatron. 2025, 37, 1205–1218. [Google Scholar] [CrossRef]
  68. Fedorova, D.; Tlach, V.; Kuric, I.; Dodok, T.; Zajačko, I.; Tucki, K. Technical Diagnostics of Industrial Robots Using Vibration Signals: Case Study on Detecting Base Unfastening. Appl. Sci. 2025, 15, 270. [Google Scholar] [CrossRef]
  69. Chen, C.; Peng, F.; Yan, R.; Li, Y.; Wei, D.; Fan, Z.; Tang, X.; Zhu, Z. Stiffness performance index based posture and feed orientation optimization in robotic milling process. Robot. Comput.-Integr. Manuf. 2019, 55, 29–40. [Google Scholar] [CrossRef]
  70. Thomsen, D.K.; Søe-Knudsen, R.; Balling, O.; Zhang, X. Vibration control of industrial robot arms by multi-mode time-varying input shaping. Mech. Mach. Theory 2021, 155, 104072. [Google Scholar] [CrossRef]
  71. Wu, J.; Zhang, D.; Liu, J.; Han, X. A Moment Approach to Positioning Accuracy Reliability Analysis for Industrial Robots. IEEE Trans. Reliab. 2020, 69, 699–714. [Google Scholar] [CrossRef]
  72. Tajima, S.; Iwamoto, S.; Yoshioka, H. Posture Optimization in Robot Machining with Kinematic Redundancy for High-Precision Positioning. Int. J. Autom. Technol. 2023, 17, 494–503. [Google Scholar] [CrossRef]
  73. Jiao, J.; Tian, W.; Zhang, L.; Li, B.; Hu, J.; Li, Y.; Li, D.; Zhang, J. Variable stiffness identification and configuration optimization of industrial robots for machining tasks. Chin. J. Mech. Eng. 2022, 35, 115. [Google Scholar] [CrossRef]
  74. D’Antona, A.; Farsoni, S.; Rizzi, J.; Bonfè, M. A Variable Stiffness System for Impact Analysis in Collaborative Robotics Applications with FPGA-Based Force and Pressure Data Acquisition. Sensors 2025, 25, 3913. [Google Scholar] [CrossRef] [PubMed]
  75. Guo, Y.; Dong, H.; Ke, Y. Stiffness-oriented posture optimization in robotic machining applications. Robot. Comput.-Integr. Manuf. 2015, 35, 69–76. [Google Scholar] [CrossRef]
  76. Badkoobehhezaveh, H.; Fotouhi, R.; Zhang, Q.; Bitner, D. Vibration Analysis of a 5-DOF Long-Reach Robotic Arm. Vibration 2022, 5, 585–602. [Google Scholar] [CrossRef]
  77. Bisu, C.; Cherif, M.; Gerard, A.; K’nevez, J.Y. Dynamic Behavior Analysis for a Six Axis Industrial Machining Robot. Adv. Mater. Res. 2011, 423, 65–76. [Google Scholar] [CrossRef]
  78. Birk, C.; Kipfmüller, M.; Kotschenreuther, J. Dynamic Modeling and Analysis of Industrial Robots for Enhanced Manufacturing Precision. Actuators 2025, 14, 311. [Google Scholar] [CrossRef]
  79. Bickel, V.T.; Manconi, A.; Amann, F. Quantitative Assessment of Digital Image Correlation Methods to Detect and Monitor Surface Displacements of Large Slope Instabilities. Remote Sens. 2018, 10, 865. [Google Scholar] [CrossRef]
  80. Lee, J.; Jeong, S.; Lee, Y.J.; Sim, S.H. Stress Estimation Using Digital Image Correlation with Compensation of Camera Motion-Induced Error. Sensors 2019, 19, 5503. [Google Scholar] [CrossRef]
  81. Siebert, T.; Wood, R.; Splitthof, K. High Speed Image Correlation for Vibration Analysis. J. Phys. Conf. Ser. 2009, 181, 012064. [Google Scholar] [CrossRef]
  82. Neri, P. Augmented-Resolution Digital image correlation algorithm for vibration measurements. Measurement 2024, 231, 114565. [Google Scholar] [CrossRef]
  83. Shahbaznia, M.; Raissi Dehkordi, M.; Mirzaee, A. An Improved Time-Domain Damage Detection Method for Railway Bridges Subjected to Unknown Moving Loads. Period. Polytech. Civ. Eng. 2020. [Google Scholar] [CrossRef]
  84. Zhou, K.; Tang, J. A wavelet neural network informed by time-domain signal preprocessing for bearing remaining useful life prediction. Appl. Math. Model. 2023, 122, 220–241. [Google Scholar] [CrossRef]
  85. Bian, L.; Gebraeel, N.; Kharoufeh, J.P. Degradation modeling for real-time estimation of residual lifetimes in dynamic environments. IIE Trans. 2015, 47, 471–486. [Google Scholar] [CrossRef]
  86. Wehrle, E.; Gufler, V.; Vidoni, R. Optimal In-Operation Redesign of Mechanical Systems Considering Vibrations—A New Methodology Based on Frequency-Band Constraint Formulation and Efficient Sensitivity Analysis. Machines 2020, 8, 11. [Google Scholar] [CrossRef]
  87. Kot, T.; Bobovský, Z.; Vysocký, A.; Krys, V.; Šafařík, J.; Ružarovský, R. Method for Robot Manipulator Joint Wear Reduction by Finding the Optimal Robot Placement in a Robotic Cell. Appl. Sci. 2021, 11, 5398. [Google Scholar] [CrossRef]
  88. Laine, S.; Haikonen, S.; Tiainen, T.; Viitala, R. Rotor resonance avoidance by continuous adjustment of support stiffness. Int. J. Mech. Sci. 2024, 270, 109092. [Google Scholar] [CrossRef]
  89. Sun, J.; Zhang, W.; Dong, X. Natural Frequency Prediction Method for 6R Machining Industrial Robot. Appl. Sci. 2020, 10, 8138. [Google Scholar] [CrossRef]
  90. Xiu, C.; Weng, Y.; Shi, W. Vision and Vibration Data Fusion-Based Structural Dynamic Displacement Measurement with Test Validation. Sensors 2023, 23, 4547. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Experimental setup and environment.
Figure 2. The robot modeled in three distinct postures, assumed to represent realistic operational configurations.
Figure 3. Multi-point DIC measurements of the robot in three operational postures. The colors assigned to each ROI, from top to bottom, correspond to the colors of the labels on the right.
Figure 4. Ground-truth measurement points.
Figure 5. Real-time vibration visualization of the robot in three different postures (top to bottom: Posture 1, Posture 2, Posture 3). The color maps were obtained from all scanned frequency data, enabling calculation of the vibration distribution at each frequency. Colors closer to red indicate higher vibration levels, while those closer to blue indicate lower levels. The results show that, even for the same robot and the same frequencies, the vibration distribution is significantly influenced by the robot's configuration.
Figure 6. Displacement analysis over a 1 s interval for the three distinct postures revealed frequency components at 17 Hz, 30 Hz, and 80 Hz, corresponding to the input excitation frequencies. These results demonstrate the accuracy and reliability of DIC-based multi-point vibration measurements.
Figure 7. Posture 1: power spectrogram analysis of the output/input ratio obtained from a 200 s HFR video, illustrating vibration transmission across the robot structure under this configuration.
Figure 8. Posture 2: power spectrogram analysis of the output/input ratio obtained from a 200 s HFR video, illustrating vibration transmission across the robot structure under this configuration.
Figure 9. Posture 3: power spectrogram analysis of the output/input ratio obtained from a 200 s HFR video, illustrating vibration transmission across the robot structure under this configuration.
Figure 13. Part-by-part transfer function identification for Posture 1.
Figure 14. Part-by-part transfer function identification for Posture 2.
Table 1. Comparison of MEMS accelerometer, LDV, MEMS acoustic, and DIC measurement methods.

Item          | MEMS Accelerometer          | LDV          | MEMS Acoustic  | DIC
Quantity      | Acceleration                | Velocity     | Sound pressure | Full-field displacement
Contact       | Contact (needs attachment)  | Non-contact  | Non-contact    | Non-contact
Installation  | Medium/complex              | Simple       | Simple         | Simple
Range         | Single-point/multi-point    | Single-point | Single-point   | Full-field
Accuracy      | High                        | High         | High           | High
Speed         | Fast                        | Slow         | Fast           | Fast
Visualization | Limited                     | Real-time    | Real-time      | Real-time
Limitations   | Mass-loading, wiring        | Sensitive    | Battery        | Large data volume