Article

A Method of Estimating an Object’s Parameters Based on Simplification with Momentum for a Manipulator

Department of Mechanical Engineering, Hanyang University, 222 Wangsimni-ro, Seongdong-gu, Seoul 04763, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(7), 3989; https://doi.org/10.3390/app15073989
Submission received: 6 March 2025 / Revised: 2 April 2025 / Accepted: 2 April 2025 / Published: 4 April 2025
(This article belongs to the Section Robotics and Automation)

Abstract
For a manipulator to estimate an object’s inertial parameters—such as mass, center of mass position, and elements of the inertia tensor—it must grasp one side of the object and generate motion while utilizing the resulting forces and torques at its end-effector for estimation. In most previous studies, the estimation motion has involved high acceleration, resulting in larger motion trajectories and increased inertial forces. A larger trajectory raises the risk of collisions, while greater inertial forces could potentially damage the object. This paper introduces an innovative approach that simplifies the dynamic model using momentum, enabling accurate parameter estimation with minimal motion in robotic manipulation. Simulations are conducted to estimate key inertial parameters of the target object, including mass, center of mass position, and elements of the inertia tensor. The robotic manipulator securely grasps one side of the object and induces controlled motions to facilitate the estimation process. A comparative analysis with previously established estimation methods demonstrates that the proposed approach achieves accurate results with significantly smaller motions than prior techniques. The maximum acceleration reduction rates are 95% in linear motion and 98% in angular motion, respectively.

1. Introduction

Numerous advanced control strategies are built upon a dynamic model as their foundation. In robotic object manipulation, accurately determining object-specific parameters—such as mass, center of mass position, and elements of the inertia tensor—enables the robot to perform more precise and efficient manipulation motions [1]. For objects that remain the same over time, these parameters can be obtained through repeated manipulation or from predefined data provided by an operator. However, when dealing with diverse objects, the robot must estimate these parameters anew for each object it encounters.
The development of strategies for robots to estimate inertial parameters has been widely studied, as summarized in a comprehensive survey [2], which classifies existing approaches into three main categories. The first approach estimates parameters using visual information and potential correlations between visual and inertial properties, often assuming uniform object density [3,4,5,6]. This method eliminates the risk of damaging the object during estimation. However, it has inherent limitations, as it relies on pre-established relationships between visual and inertial properties. This makes it unsuitable for estimating the parameters of objects with varying internal contents, such as a box containing different items.
The second approach involves instructing the robot to gently prod or nudge the object, and then calculating its parameters based on the forces applied by the robot and the resulting reactions from the object [7,8,9,10]. However, this technique requires strong assumptions about the robot’s environment and precise control over the interaction process. For example, when a robot pushes an object on a table and measures its motion, it must have accurate knowledge of the friction coefficients between the object and the table, as well as between the pushing mechanism and the object. Additionally, it must ensure no slippage occurs between the robot’s end-effector and the object’s surface.
The third approach involves securely attaching the object to the robot, initiating motion, and subsequently estimating the object’s parameters using end-effector wrenches and object motion based on its dynamics [11,12]. This method is relatively straightforward, as it requires only a secure connection between the robot and the object, achieved through a firm grasp or other suitable means. Numerous studies have explored this approach using various techniques. For instance, the authors of [13] estimated the parameters of an object attached to a robot’s end-effector using an offline least squares method, similar to robot dynamics estimation [14,15,16,17], since the number of equations describing the object’s dynamics is smaller than the number of parameters to be estimated. The offline least squares method is simple and independent of initial conditions. However, it cannot adapt to parameter changes occurring during the process. To address this limitation, researchers have developed online estimation strategies that can accommodate parameter variations during tasks, utilizing recursive least squares [18,19,20,21] or extended Kalman filters [22,23]. While online methods can adjust to parameter changes in real time, improper selection of initial conditions may lead to inaccurate estimation results. Recent advancements have incorporated visual data [24], generalized momentum observers with filters [25], and machine learning techniques [26,27,28,29] to improve estimation performance. Parameter estimation methods have also been applied in vehicle research to enhance electronic control systems, contributing to improved vehicle safety and passenger comfort [30,31,32,33]. This highlights the broad applicability of parameter estimation, extending beyond robotics into various other domains.
However, these approaches face specific challenges due to the underconstrained nature of the problem, where the number of parameters to be estimated exceeds the available dynamic equations. Additionally, the force and torque data required for estimation depend on acceleration data, which are typically obtained by differentiating velocity with respect to time. This differentiation process amplifies noise, making precise estimation more difficult. To mitigate these issues, researchers have attempted to use high acceleration to reduce estimation error. However, increasing acceleration often results in larger motion during estimation, raising the risk of collisions between the object and the robot’s links or surrounding structures. Moreover, if the object has substantial inertial parameters, achieving the desired acceleration becomes challenging. Even if high acceleration is attained, the resulting inertial forces could weaken the coupling between the robot and the object, potentially causing the grasp to slip or even damaging the object during estimation.
This paper introduces a novel approach for estimating the inertial parameters of a grasped object by simplifying the estimation model using momentum. By integrating force and moment data over time and assuming a zero initial condition, the method reduces noise and derives linear and angular momentum. To address the underconstrained nature of the estimation model, the object’s parameters are estimated separately, ensuring that the number of parameters aligns with the available dynamic equations. These parameters include the object’s mass, center of mass, and inertia tensor. The estimation process is structured so that each parameter is independently determined using distinct linear and rotational motions, with rotational motions applied separately along each axis. The key contribution of this study is its ability to achieve accurate parameter estimation with significantly smaller motion ranges, velocities, and accelerations compared to previous research. The effectiveness of the proposed method is validated through comparative analyses against established parameter estimation techniques.
The subsequent sections of the paper are organized as follows: Section 2 formulates the estimation model, while Section 3 describes the suggested parameter estimation method. The evaluation of the suggested method is presented in Section 4, and the results of the simulations are discussed in Section 5.

2. System Modeling

Assuming the robot securely grasps an object with its gripper installed at its end-effector to facilitate smooth motions, as shown in Figure 1, coordinate frames {O}, {R}, and {G} represent the robot’s base, the robot’s end-effector, and the center of mass (COM) of the object, respectively. The force and torque applied at the robot’s end-effector will affect the motion of both the gripper and the object. By utilizing the specifications or CAD data of the gripper provided by the manufacturer, the parameters of the gripper can be predetermined, thereby enabling calculation of the parameters of the object.
With {R} and {G} fixed to the object and the gripper securely attached to the robot’s end-effector, a constant relative kinematic relationship between {R} and {G} is maintained throughout the parameter estimation process, ensuring their parallel alignment. The object has a mass denoted by m, and the robot exerts forces ( f R ) and torques ( τ R ) on it. Given this grip, the positions of the end-effector and the object’s center of mass (COM) are represented by r R and r G , respectively, while the linear acceleration of the COM is denoted by a G . The object’s dynamics can then be described by
$\mathbf{f}_R = m\,\mathbf{a}_G - m\,\mathbf{g}$ (1)
$\boldsymbol{\tau}_{R,b} - \mathbf{r}_{G/R,b} \times \mathbf{f}_{R,b} = \mathbf{I}_{G,b}\,\boldsymbol{\alpha}_b + \boldsymbol{\omega}_b \times \mathbf{I}_{G,b}\,\boldsymbol{\omega}_b$ (2)
where the subscript b denotes that the corresponding vector is expressed in the body coordinate frame attached to the object, $\mathbf{g} = [0\ 0\ {-1}]^T g$ denotes the gravitational acceleration vector with gravitational acceleration $g$, and $\mathbf{r}_{G/R} = \mathbf{r}_G - \mathbf{r}_R$. $\mathbf{I}_{G,b}$ represents the inertia tensor matrix expressed in the coordinate frame {G}, defined by
$\mathbf{I}_{G,b} = \begin{bmatrix} I_{xx} & I_{xy} & I_{xz} \\ I_{xy} & I_{yy} & I_{yz} \\ I_{xz} & I_{yz} & I_{zz} \end{bmatrix}$ (3)
Furthermore, α and ω denote the angular acceleration and velocity of the object, respectively.
The position of COM, r G , can be expressed as
$\mathbf{r}_G = \mathbf{r}_R + \mathbf{R}\,\mathbf{r}_{G/R,b}$ (4)
where $\mathbf{R}$ represents the rotation matrix from {G} to {O}. Differentiating Equation (4) with respect to time gives
$\mathbf{v}_G \equiv \dot{\mathbf{r}}_G = \mathbf{v}_R + \boldsymbol{\omega} \times \mathbf{R}\,\mathbf{r}_{G/R,b}$ (5)
and differentiating Equation (5) once more yields
$\mathbf{a}_G \equiv \dot{\mathbf{v}}_G = \mathbf{a}_R + \boldsymbol{\alpha} \times \mathbf{R}\,\mathbf{r}_{G/R,b} + \boldsymbol{\omega} \times \left( \boldsymbol{\omega} \times \mathbf{R}\,\mathbf{r}_{G/R,b} \right)$ (6)
where $\mathbf{v}_R \equiv \dot{\mathbf{r}}_R$ and $\mathbf{a}_R \equiv \dot{\mathbf{v}}_R$.
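To make the modeling concrete, the following minimal NumPy sketch evaluates Equations (1), (2), and (6) for a given end-effector motion. The function and argument names are illustrative assumptions made here, not the authors' implementation.

```python
import numpy as np

def com_acceleration(a_R, alpha, omega, Rot, r_GRb):
    """Equation (6): linear acceleration of the object's COM from end-effector motion."""
    r = Rot @ r_GRb                                    # COM offset R r_{G/R,b} expressed in {O}
    return a_R + np.cross(alpha, r) + np.cross(omega, np.cross(omega, r))

def object_wrench(m, I_Gb, r_GRb, Rot, a_R, alpha, omega, g=9.81):
    """Equations (1)-(2): force at {R} (in {O}) and torque at {R} (in the body frame)
    required to produce the commanded motion of the grasped object."""
    g_vec = np.array([0.0, 0.0, -g])                   # gravitational acceleration vector
    a_G = com_acceleration(a_R, alpha, omega, Rot, r_GRb)
    f_R = m * a_G - m * g_vec                          # Equation (1)
    f_Rb = Rot.T @ f_R                                 # force expressed in the body frame
    alpha_b, omega_b = Rot.T @ alpha, Rot.T @ omega
    tau_Rb = (I_Gb @ alpha_b + np.cross(omega_b, I_Gb @ omega_b)
              + np.cross(r_GRb, f_Rb))                 # Equation (2) solved for tau_{R,b}
    return f_R, tau_Rb
```

In the estimation problem the wrench is measured rather than computed; the sketch simply shows how the unknown parameters $m$, $\mathbf{r}_{G/R,b}$, and $\mathbf{I}_{G,b}$ enter the model.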

3. Estimation Algorithms

The proposed approach estimates the inertial parameters $m$, $\mathbf{r}_{G/R,b}$, and $\mathbf{I} = [I_{xx}\ I_{xy}\ I_{xz}\ I_{yy}\ I_{yz}\ I_{zz}]^T$. To address the underconstrained nature of the estimation model, these parameters are estimated separately in three steps, with some of them obtained by simplifying the model through time integration.
Step 1: The mass of the object can be determined directly while the robot holds it stationary:
$m = [0\ 0\ 1]\,\mathbf{f}_R / g$ (7)
Step 2: The position of the COM of the object is estimated through a linear motion, with $\boldsymbol{\alpha} = \boldsymbol{\omega} = \mathbf{0}$ in Equation (2), during which {O}, {R}, and {G} remain parallel to each other:
$\boldsymbol{\tau}_{R,b} = \mathbf{r}_{G/R,b} \times \mathbf{f}_{R,b} = [\mathbf{f}_{R,b}\times]^T\,\mathbf{r}_{G/R,b}$ (8)
where $[\mathbf{f}_{R,b}\times]$ is the cross-product matrix of $\mathbf{f}_{R,b}$.
The least squares method is employed to estimate the position of the COM of the object because the cross-product matrix is singular and the measured forces and torques contain noise. With $n$ measurements of $\boldsymbol{\tau}_{R,b}$ and $\mathbf{f}_{R,b}$, we define $\mathbf{Y}$, $\boldsymbol{\Phi}$, and $\mathbf{E}$ as
$\mathbf{Y} \equiv \begin{bmatrix} \boldsymbol{\tau}_{R,b}(1) \\ \boldsymbol{\tau}_{R,b}(2) \\ \vdots \\ \boldsymbol{\tau}_{R,b}(n) \end{bmatrix} = \begin{bmatrix} [\mathbf{f}_{R,b}(1)\times]^T \\ [\mathbf{f}_{R,b}(2)\times]^T \\ \vdots \\ [\mathbf{f}_{R,b}(n)\times]^T \end{bmatrix} \mathbf{r}_{G/R,b} + \begin{bmatrix} \mathbf{e}(1) \\ \mathbf{e}(2) \\ \vdots \\ \mathbf{e}(n) \end{bmatrix} \equiv \boldsymbol{\Phi}\,\mathbf{r}_{G/R,b} + \mathbf{E}$ (9)
where $\boldsymbol{\tau}_{R,b}(k)$ and $\mathbf{f}_{R,b}(k)$ are the $k$th measurements of torque and force, respectively; $\mathbf{e}(k)$ is the $k$th measurement error caused by noise or mechanical vibration induced by the robot; and $n$ is the number of measurements taken during the estimation process. Using the least squares method to minimize $\mathbf{E}^T\mathbf{E}$,
$\mathbf{r}_{G/R,b} = \left( \boldsymbol{\Phi}^T \boldsymbol{\Phi} \right)^{-1} \boldsymbol{\Phi}^T \mathbf{Y}$ (10)
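A minimal NumPy sketch of Steps 1 and 2 is given below, assuming the force and torque samples have already been collected; the function names and data layout are illustrative, not the authors' code.

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v x], so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_mass(f_R_static, g=9.81):
    """Step 1, Equation (7): mass from the vertical force while the object is held still."""
    return f_R_static[2] / g

def estimate_com(tau_samples, f_samples):
    """Step 2, Equations (9)-(10): least-squares estimate of r_{G/R,b} from n torque and
    force samples (each a length-3 body-frame vector) taken during purely linear motion."""
    Y = np.concatenate(list(tau_samples))                  # (3n,) stacked torque samples
    Phi = np.vstack([skew(f).T for f in f_samples])        # (3n, 3) stacked [f x]^T blocks
    r_GRb, *_ = np.linalg.lstsq(Phi, Y, rcond=None)        # same solution as (Phi^T Phi)^-1 Phi^T Y
    return r_GRb
```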
Step 3: The elements of the inertia tensor matrix are determined from the angular momentum $\mathbf{H}_G$, which can be expressed as
$\mathbf{H}_G = \mathbf{I}_G\,\boldsymbol{\omega}$ (11)
Since $\dot{\mathbf{H}}_G = \mathbf{M}_G$, the moment about the COM, $\mathbf{H}_G$ can be calculated with the following equation by assuming a zero initial condition:
$\mathbf{H}_G = \int \left( \boldsymbol{\tau}_R - \mathbf{r}_{G/R} \times \mathbf{f}_R \right) dt$ (12)
The rotational motion is applied sequentially in the x-, y-, and z-directions. For example, if the object rotates solely in the x-direction, $\mathbf{H}_{G,b} \equiv \mathbf{R}^T \mathbf{H}_G$ can be expressed as
$\begin{bmatrix} H_{G,b,x} \\ H_{G,b,y} \\ H_{G,b,z} \end{bmatrix} = \begin{bmatrix} I_{xx} \\ I_{xy} \\ I_{xz} \end{bmatrix} \omega_{b,x}$ (13)
Subsequently, $I_{xx}$, $I_{xy}$, and $I_{xz}$ can be computed by dividing $\mathbf{H}_{G,b}$ by $\omega_{b,x}$. However, external disturbances, such as measurement sensor noise or mechanical vibration induced by the robot, amplify the estimation error. Additionally, if $\omega_{b,x}$ is too small, dividing $\mathbf{H}_{G,b}$ by $\omega_{b,x}$ can yield unexpected results. To mitigate the effect of such disturbances, we employ the least squares method by rewriting Equation (13) as
$\begin{bmatrix} H_{G,b,x} \\ H_{G,b,y} \\ H_{G,b,z} \end{bmatrix} = \begin{bmatrix} \omega_{b,x} & 0 & 0 \\ 0 & \omega_{b,x} & 0 \\ 0 & 0 & \omega_{b,x} \end{bmatrix} \begin{bmatrix} I_{xx} \\ I_{xy} \\ I_{xz} \end{bmatrix}$ (14)
Then, following procedures similar to those in Equations (9) and (10), with $\mathbf{Y}$ and $\boldsymbol{\Phi}$ set as
$\mathbf{Y} = \left[ \mathbf{H}_{G,b}(1)^T,\ \mathbf{H}_{G,b}(2)^T,\ \ldots,\ \mathbf{H}_{G,b}(n)^T \right]^T$ (15)
$\boldsymbol{\Phi} = \left[ \omega_{b,x}(1)\,\mathbf{I}_3,\ \omega_{b,x}(2)\,\mathbf{I}_3,\ \ldots,\ \omega_{b,x}(n)\,\mathbf{I}_3 \right]^T$ (16)
where $\mathbf{I}_3$ is the 3 × 3 identity matrix, $I_{xx}$, $I_{xy}$, and $I_{xz}$ can be estimated from Equations (14)–(16). The same methodology is applied when the object rotates in the y- and z-directions. In these instances, $I_{xy}$, $I_{xz}$, and $I_{yz}$ are each estimated twice, and the two estimates are averaged.
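A corresponding sketch of Step 3 for the rotation about the body x-axis is shown below; the rectangular integration of Equation (12) and the array layout are illustrative assumptions made for this sketch.

```python
import numpy as np

def estimate_inertia_x(tau_R, f_R, r_GR, Rot, omega_bx, dt):
    """Step 3, Equations (12)-(16): estimate (Ixx, Ixy, Ixz) from a rotation applied
    solely about the body x-axis.

    tau_R, f_R : (n, 3) torque and force at {R}, expressed in {O}
    r_GR       : (n, 3) COM offset R r_{G/R,b}, expressed in {O}
    Rot        : (n, 3, 3) rotation matrices from {G} to {O}
    omega_bx   : (n,) body-frame angular velocity about x
    dt         : sampling period (1 ms in the simulations)
    """
    # Equation (12): integrate the moment about the COM, assuming H_G(0) = 0
    # (simple rectangular integration of the sampled moments).
    M_G = tau_R - np.cross(r_GR, f_R)
    H_G = np.cumsum(M_G, axis=0) * dt
    # H_{G,b} = R^T H_G: express the angular momentum in the body frame.
    H_Gb = np.einsum('nji,nj->ni', Rot, H_G)
    # Equations (15)-(16): stack the samples and solve the least-squares problem.
    Y = H_Gb.reshape(-1)                                   # (3n,)
    Phi = np.vstack([w * np.eye(3) for w in omega_bx])     # (3n, 3)
    I_col, *_ = np.linalg.lstsq(Phi, Y, rcond=None)        # [Ixx, Ixy, Ixz]
    return I_col
```

Repeating the same call for the rotations about the y- and z-axes yields the remaining columns of the inertia tensor, after which the doubly estimated products of inertia are averaged as described above.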

4. Parameter Estimation Simulation

4.1. Simulation Setup

The simulations for parameter estimation were conducted using RecurDyn (V8R5), a commercial software package, with the robot model being the UR16e by Universal Robots. As shown in Figure 2, the robot estimates the object parameters by securely grasping and shaking one side of it while its second, third, and fourth joints are positioned at orthogonal angles, forming an elbow-up configuration. A force-torque sensor located at the robot’s end-effector measures the force and torque at {R}. Standard deviations of zero-mean Gaussian distributions were used to generate noise for the simulation experiments, as detailed in Table 1, with values selected based on [24]. Three objects with distinct parameters were employed in the simulations, with the actual parameters of each object provided in Table 2. The objects were chosen to have parameters within the maximum allowed payload of the UR16e [34]. The simulation time step was set to 1 ms.
The trajectory utilized for the estimation with the proposed method involves guiding the robot’s end-effector to move linearly in all directions and then rotate the end-effector separately in the x-, y-, and z-directions. As outlined in Section 3, the robot initially estimates the mass of the object while holding it stationary. Subsequently, linear motion is generated in all directions during the estimation of $\mathbf{r}_{G/R,b}$, and rotational motion in the x-, y-, and z-directions is carried out separately during the estimation of $\mathbf{I}$, each with a designated motion range. The term ‘motion range’ refers to the extent of linear and rotational motion generated by the robot, expressed in meters and radians, respectively. For instance, if the robot generates a motion with a motion range of 0.1, it moves the grasped object linearly by 0.1 m in all directions and then rotates the object by 0.1 radians in each of the x-, y-, and z-directions separately. In this study, the robot executes the estimation motion with motion ranges of 0.1, 0.05, and 0.025. The trajectory for this motion entails the robot’s end-effector moving linearly and angularly to the designated motion range and then returning within 2 s, utilizing the minimum jerk trajectory [35]. During the simulations, the robot only possesses knowledge of its own motion and of the forces and torques at {R}.
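As an illustration of this trajectory, the sketch below generates the go-and-return profile with the minimum-jerk polynomial of [35]. Splitting the 2 s motion into two equal 1 s segments is an assumption made here, although it is consistent with the proposed-method peak values reported later in Table 7.

```python
import numpy as np

def min_jerk(t, T, delta):
    """Minimum-jerk displacement profile [35]: from 0 to delta over duration T."""
    s = np.clip(t / T, 0.0, 1.0)
    return delta * (10 * s**3 - 15 * s**4 + 6 * s**5)

def out_and_back(t, T_total, motion_range):
    """Assumed go-and-return profile: reach the motion range in the first half of the
    motion and return to the start in the second half, each half being minimum jerk."""
    half = T_total / 2.0
    if t <= half:
        return min_jerk(t, half, motion_range)
    return motion_range - min_jerk(t - half, half, motion_range)

# Example: a 0.1 motion range executed within 2 s, sampled at the 1 ms simulation step.
ts = np.arange(0.0, 2.0 + 1e-9, 0.001)
trajectory = np.array([out_and_back(t, 2.0, 0.1) for t in ts])
```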

4.2. Simulation Result and Analysis

The simulation results are presented in Table 3, Table 4, Table 5 and Table 6. Table 3, Table 4 and Table 5 depict the error between the actual and estimated values by each motion range and noise level. Specifically, Table 3 presents the estimation error for Object 1, Table 4 for Object 2, and Table 5 for Object 3. The estimation error is quantified using the following equations:
$e_m = \dfrac{|\hat{m} - m|}{m} \times 100\%$ (17)
$e_{r_{G/R,b,i}} = \dfrac{|\hat{r}_{G/R,b,i} - r_{G/R,b,i}|}{|r_{G/R,b,i}|} \times 100\%$ (18)
$e_{I_i} = \dfrac{|\hat{I}_i - I_i|}{|I_i|} \times 100\%$ (19)
where $\hat{m}$, $\hat{r}_{G/R,b,i}$, and $\hat{I}_i$ represent the estimated values of the mass, the elements of $\mathbf{r}_{G/R,b}$ along each axis, and each element of $\mathbf{I}$, respectively. Table 6 presents the range of joint angles, defined as the difference between the maximum and minimum angles of each joint while the robot generates the motion, along with the maximum magnitude of the joint velocities and accelerations during the estimation process with the proposed method. The maximum joint velocity of the UR16e is $\frac{2}{3}\pi$ rad/s (120°/s) for the first and second joints and $\pi$ rad/s (180°/s) for all other joints [34]. None of the motion ranges exceeded these limits. The average time taken to compute the parameter estimates from the collected measurements was 0.31 s.
The results reveal that $m$ and $\mathbf{r}_{G/R,b}$ were estimated with a maximum estimation error of 9.8% across all motion ranges and noise levels, indicating a low estimation error. However, the estimation error of $\mathbf{I}$ increased as the motion range decreased. This is attributed to the decrease in the signal-to-noise ratio (SNR) as the motion range decreases, even though noise from the force-torque sensor was minimized by integrating the data over time instead of relying on noisy acceleration data. Interestingly, reducing the linear motion range to 0.025 m did not appear to affect the estimation error of $m$ and $\mathbf{r}_{G/R,b}$, likely because linear and angular motions are expressed in different units.

4.3. Comparison with Previous Studies

The previous studies compared with the proposed method are [13,18,26]. Due to the use of different robots in these studies, direct comparison of the estimation results is challenging. Additionally, the differences in the degrees of freedom of the robots make it difficult to apply the same estimation motion across all platforms. Therefore, the accuracy of the previous methods is acknowledged, and the comparison focuses on the estimation motion, demonstrating that the proposed method achieves precise estimates with smaller motions than the earlier approaches.
In [13], estimation was conducted using a PUMA 600 robot, with a minimum jerk trajectory for the estimation motion, utilizing the least squares method. The total estimation time was 2 s.
In [18], object parameters were estimated by integrating force and moment data over time, using the least squares method. The joint trajectory was defined by a 5th-order polynomial, though exact values for joint velocity and acceleration were not provided. Therefore, the maximum joint velocity and acceleration are calculated by assuming a 5th-order polynomial trajectory with constraints ensuring zero initial and final velocity and acceleration. The estimation process took 1.83 s using an IRp-6 robot.
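The peak values implied by such a rest-to-rest quintic follow from its normalized form $s(\tau) = 10\tau^3 - 15\tau^4 + 6\tau^5$: the maximum velocity is $1.875\,\Delta/T$ and the maximum acceleration is about $5.77\,\Delta/T^2$ for a displacement $\Delta$ covered in time $T$. A short illustrative sketch (not the authors' code):

```python
def quintic_peaks(delta, T):
    """Peak velocity and acceleration of a rest-to-rest 5th-order polynomial covering a
    displacement delta in time T with zero boundary velocity and acceleration."""
    v_max = 1.875 * delta / T            # reached at the midpoint of the motion
    a_max = 5.774 * delta / T**2         # reached at t/T ~ 0.211 and its mirror point
    return v_max, a_max

# Illustration: a 1 rad rest-to-rest sweep completed in 1 s peaks at about
# 1.875 rad/s and 5.774 rad/s^2.
print(quintic_peaks(1.0, 1.0))
```

The same factors applied to the proposed method's 0.1 motion range, executed as two 1 s minimum-jerk segments, give 18.8 cm/s and 57.7 cm/s2 (188 mrad/s and 577 mrad/s2), which matches the proposed-method column of Table 7.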
In [26], object parameters were estimated using machine learning with maximum-likelihood estimation. The estimation motion followed sinusoidal functions, with different motion ranges and frequencies for each joint’s trajectory. A Franka Emika Panda robot was used for this estimation.
The estimation motions are summarized in Table 7 and Table 8. Table 7 shows the range of linear and angular displacements of the robot’s end-effector, along with the maximum velocities and accelerations observed during the estimation process in previous studies and in this study with a motion range of 0.1. Table 8 provides the range of joint angles and the corresponding maximum joint velocities and accelerations. To calculate the end-effector motions from previous studies, the kinematic parameters from [36,37,38] were used. Additionally, joint angles originally reported in degrees in [13,18] were converted to radians for consistency. The object dynamics used for the estimation in these studies are detailed in Appendix A.
Table 7 and Table 8 indicate that the proposed method achieved accurate parameter estimation with smaller estimation motions compared to previous studies. Although certain directional accelerations in the prior studies shown in Table 7 are lower than those in the proposed method, the accelerations in other directions are considerably higher. Furthermore, the proposed method separates linear and rotational motions, with rotational motions applied independently to each axis, whereas previous methods applied multi-directional linear and angular motions simultaneously. This approach allows the proposed method to achieve precise estimation with significantly smaller motions compared to previous techniques, thereby reducing the inertial forces applied to the object and minimizing the risk of collision during the estimation process. The maximum acceleration difference between the proposed method and previous methods is 1172.3 cm/s2 in the linear x-direction and 23,823 mrad/s2 in the angular y-direction when comparing the proposed method and the method in [26]. The reduction rates are 95% and 98%, respectively.
A comparison of joint motions in Table 6 and Table 8 demonstrates that the proposed method is both safer and more adaptable than previous approaches. The smaller range of joint motion in the proposed method reduces the risk of collisions between the manipulator’s links during estimation. Additionally, the lower maximum joint velocity and acceleration indicate that the proposed method can be applied to manipulators with less power.
If estimation is required in environments with high noise, increasing the motion range to 0.2 radians or more may enhance accuracy. However, the proposed method requires more estimation time compared to previous methods. While the earlier approaches typically completed the estimation within 1–2 s, or about 5 s at most, the proposed method required 10 s, reflecting a trade-off between safety and estimation time.

5. Conclusions

This paper presents a method for accurately estimating the parameters of a manipulated object using minimal motion. By leveraging momentum, the estimation model is simplified to focus on key parameters, including the object’s mass, the position of its center of mass (COM) relative to the robot’s end-effector, and elements of the object’s inertia tensor matrix.
The effectiveness of the proposed method is validated through simulations that compare its results with those of previous research. Unlike earlier approaches that rely on high-acceleration motions for parameter estimation, the proposed method uses a momentum-based approach with smaller motions.
The results indicate that the proposed method achieves accuracy comparable to previous studies while requiring significantly smaller motions. As a result, the inertial forces generated during estimation are lower, enhancing safety. Additionally, the reduced motion range allows the proposed method to be applied to manipulators with less power. This improvement stems from the simplified estimation model, which effectively addresses the underconstrained nature of the problem by incorporating momentum. Furthermore, in environments with higher noise levels than those considered in this study, increasing the estimation motion range to 0.2 radians or slightly higher could potentially improve accuracy.
However, the proposed method has two limitations. First, it requires more time for estimation compared to previous methods. Second, it requires the robot to grasp and shake one side of the object, which adds to the payload and potentially limits the robot’s maximum payload capacity during manipulation. While parameter estimation during collaborative manipulation with a human or another robot could help overcome this limitation, it introduces new challenges due to disturbances from the external forces and torques generated by the collaborating partner. Future research will explore solutions to these challenges.

Author Contributions

Conceptualization, methodology, software, validation, formal analysis, investigation, resources, data curation, writing—original draft preparation, visualization, J.J.; writing—review and editing, supervision, J.H.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

The object dynamics used for the estimation in [13,18,26] are expressed as
$\mathbf{W} = \mathbf{K}\,\boldsymbol{\Phi}$
where
$\mathbf{W} = \left[ \mathbf{f}_R^T,\ \boldsymbol{\tau}_{R,b}^T \right]^T$
$\mathbf{K} = \begin{bmatrix} \mathbf{a}_R - \mathbf{g} & [\boldsymbol{\alpha}\times] + [\boldsymbol{\omega}\times][\boldsymbol{\omega}\times] & \mathbf{0}_{3\times 6} \\ \mathbf{0}_{3\times 1} & [(\mathbf{g}_b - \mathbf{a}_{R,b})\times] & [\cdot\boldsymbol{\alpha}_b] + [\boldsymbol{\omega}_b\times]\,[\cdot\boldsymbol{\omega}_b] \end{bmatrix}$
$\boldsymbol{\Phi} = \left[ m,\ m r_x,\ m r_y,\ m r_z,\ I_{xx},\ I_{xy},\ I_{xz},\ I_{yy},\ I_{yz},\ I_{zz} \right]^T$
where
$[\cdot\boldsymbol{\omega}_b] = \begin{bmatrix} \omega_{b,x} & \omega_{b,y} & \omega_{b,z} & 0 & 0 & 0 \\ 0 & \omega_{b,x} & 0 & \omega_{b,y} & \omega_{b,z} & 0 \\ 0 & 0 & \omega_{b,x} & 0 & \omega_{b,y} & \omega_{b,z} \end{bmatrix}$
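For completeness, a small NumPy sketch that assembles this 6 × 10 regressor for a single measurement sample is given below; it is a sketch of the classical formulation above, with illustrative function names.

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v x], so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def dot_omega(w):
    """3 x 6 operator [. w] acting on [Ixx, Ixy, Ixz, Iyy, Iyz, Izz]^T."""
    wx, wy, wz = w
    return np.array([[wx, wy, wz, 0., 0., 0.],
                     [0., wx, 0., wy, wz, 0.],
                     [0., 0., wx, 0., wy, wz]])

def regressor(a_R, alpha, omega, a_Rb, alpha_b, omega_b, g_vec, g_b):
    """Single-sample 6 x 10 regressor K such that W = K @ Phi."""
    top = np.hstack([(a_R - g_vec).reshape(3, 1),
                     skew(alpha) + skew(omega) @ skew(omega),
                     np.zeros((3, 6))])
    bottom = np.hstack([np.zeros((3, 1)),
                        skew(g_b - a_Rb),
                        dot_omega(alpha_b) + skew(omega_b) @ dot_omega(omega_b)])
    return np.vstack([top, bottom])
```

Stacking the regressors and wrench measurements from many samples and solving the resulting least-squares problem yields all ten parameters simultaneously, which is the formulation the proposed method deliberately avoids by estimating the parameters in separate steps.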

References

  1. Khalil, W.; Dombre, E. Modeling, Identification and Control of Robots, 3rd ed.; Taylor and Francis: London, UK, 2002. [Google Scholar]
  2. Mavrakis, N.; Stolkin, R. Estimation and exploitation of objects’ inertial parameters in robotic grasping and manipulation: A survey. Robot. Auton. Syst. 2020, 124, 103374. [Google Scholar]
  3. Chien, C.H.; Aggarwal, J.K. Identification of 3D objects from multiple silhouettes using quadtrees/octrees. Comput. Vision Graph. Image Process. 1986, 36, 256–272. [Google Scholar] [CrossRef]
  4. Mirtich, B. Fast and accurate computation of polyhedral mass properties. J. Graph. Tools 1996, 1, 31–50. [Google Scholar]
  5. Lines, J.A.; Tillett, R.D.; Ross, L.G.; Chan, D.; Hockaday, S.; McFarlane, N.J.B. An automatic image-based system for estimating the mass of free-swimming fish. Comput. Electron. Agric. 2001, 31, 151–168. [Google Scholar]
  6. Omid, M.; Khojastehnazhand, M.; Tabatabaeefar, A. Estimating volume and mass of citrus fruits by image processing technique. J. Food Eng. 2010, 100, 315–321. [Google Scholar]
  7. Yu, Y.; Arima, T.; Tsujio, S. Estimation of object inertia parameters on robot pushing operation. In Proceedings of the International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 1657–1662. [Google Scholar]
  8. Methil, N.S.; Mukherjee, R. Pushing and steering wheelchairs using a holonomic mobile robot with a single arm. In Proceedings of the International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 5781–5785. [Google Scholar]
  9. Artashes, M.; Burschka, D. Visual estimation of object density distribution through observation of its impulse response. In Proceedings of the International Conference on Computer Vision Theory and Applications, Barcelona, Spain, 21–24 February 2013; pp. 586–595. [Google Scholar]
  10. Franchi, A.; Petitti, A.; Rizzo, A. Distributed estimation of the inertial parameters of an unknown load via multi-robot manipulation. In Proceedings of the Conference on Decision and Control, Los Angeles, CA, USA, 15–17 December 2014; pp. 6111–6116. [Google Scholar]
  11. Wu, J.; Wang, J.; You, Z. An overview of dynamic parameter identification of robots. Robot. Comput.-Integr. Manuf. 2010, 26, 414–419. [Google Scholar] [CrossRef]
  12. Ljung, L. System Identification: Theory for the User, 2nd ed; Prentice-Hall: Hoboken, NJ, USA, 1999. [Google Scholar]
  13. Atkeson, C.G.; An, C.H.; Hollerbach, J.M. Estimation of inertial parameters of manipulator loads and links. Int. J. Robot. Res. 1986, 5, 101–119. [Google Scholar] [CrossRef]
  14. Radkhah, K.; Kulic, D.; Croft, E. Dynamic Parameter Identification for the CRS A460 Robot. In Proceedings of the International conference on Intelligent Robots and Systems, San Diego, CA, USA, 29 October–2 November 2007; pp. 3842–3847. [Google Scholar]
  15. Choi, J.S.; Yoon, J.H.; Park, J.H.; Kim, P.J. A numerical algorithm to identify independent grouped parameters of robot manipulator for control. In Proceedings of the Advanced Intelligent Mechatronics, Budapest, Hungary, 4–6 July 2011; pp. 373–378. [Google Scholar]
  16. Swevers, J.; Verdonck, W.; Schutter, J.D. Dynamic Model Identification for Industrial Robots. IEEE Control Syst. Mag. 2007, 27, 58–71. [Google Scholar]
  17. Bahloul, A.; Tliba, S.; Chitour, Y. Dynamic Parameters Identification of an Industrial Robot with and Without Payload. IFAC-Pap. 2018, 51, 443–448. [Google Scholar]
  18. Dutkiewicz, P.; Kozlowski, K.R.; Wroblewski, W.S. Experimental identification of robot and load dynamic parameters. In Proceedings of the Conference on Control Applications, Vancouver, BC, Canada, 13–16 September 1993; pp. 767–776. [Google Scholar]
  19. Traversaro, S.; Brossette, S.; Escande, A.; Nori, F. Identification of fully physical consistent inertial parameters using optimization on manifolds. In Proceedings of the International Conference on Intelligent Robots and Systems, Daejeon, Republic of Korea; 2016; pp. 5446–5451. [Google Scholar]
  20. Wensing, P.M.; Kim, S.; Slotine, J.J.E. Linear matrix inequalities for physically consistent inertial parameter identification: A statistical perspective on the mass distribution. IEEE Robot. Autom. Lett. 2018, 3, 60–67. [Google Scholar] [CrossRef]
  21. Cehajic, D.; Dohmann, P.B.; Hirche, S. Estimating unknown object dynamics in human-robot manipulation tasks. In Proceedings of the International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 1730–1737. [Google Scholar]
  22. Jang, J.; Park, J.H. Parameter identification of an unknown object in human-robot collaborative manipulation. In Proceedings of the International Conference on Control, Automation and Systems, Busan, Republic of Korea, 13–16 October 2020; pp. 1086–1091. [Google Scholar]
  23. Park, J.; Shin, Y.S.; Kim, S. Object-Aware Impedance Control for Human–Robot Collaborative Task With Online Object Parameter Estimation. IEEE Trans. Autom. Sci. Eng. 2024, 22, 8081–8094. [Google Scholar] [CrossRef]
  24. Nadeau, P.; Giamou, M.; Kelly, J. Fast Object Inertial Parameter Identification for Collaborative Robots. In Proceedings of the International Conference on Robotics and Automation, Philadelphia, PA, USA, 23–27 May 2022; pp. 3560–3566. [Google Scholar]
  25. Kurdas, A.; Hamad, M.; Vorndamme, J.; Mansfeld, N.; Abdolshah, S.; Haddadin, S. Online Payload Identification for Tactile Robots Using the Momentum Observer. In Proceedings of the International Conference on Robotics and Automation, Philadelphia, PA, USA, 23–27 May 2022; pp. 5953–5959. [Google Scholar]
  26. Pavlic, M.; Markert, T.; Matich, S.; Burschka, D. RobotScale: A Framework for Adaptable Estimation of Static and Dynamic Object Properties with Object-dependent Sensitivity Tuning. In Proceedings of the International Conference on Robot and Human Interactive Communication, Busan, Republic of Korea, 28–31 August 2023; pp. 668–674. [Google Scholar]
  27. Taie, W.; ElGeneidy, K.; L-Yacoub, A.A.; Ronglei, S. Online Identification of Payload Inertial Parameters Using Ensemble Learning for Collaborative Robots. IEEE Robot. Autom. Lett. 2024, 9, 1350–1356. [Google Scholar] [CrossRef]
  28. Baek, D.; Peng, B.; Gupta, S.; Ramos, J. Online Learning-Based Inertial Parameter Identification of Unknown Object for Model-Based Control of Wheeled Humanoids. IEEE Robot. Autom. Lett. 2024, 9, 11154–11161. [Google Scholar] [CrossRef]
  29. Shan, S.; Pham, Q.C. Fast Payload Calibration for Sensorless Contact Estimation Using Model Pre-Training. IEEE Robot. Autom. Lett. 2024, 9, 9007–9014. [Google Scholar] [CrossRef]
  30. Wenzel, T.A.; Burnham, K.J.; Blundell, M.V.; Williams, R.A. Dual extended Kalman filter for vehicle state and parameter estimation. Veh. Syst. Dyn. 2006, 44, 151–171. [Google Scholar] [CrossRef]
  31. Best, M.C.; Newton, A.P.; Tuplin, S. The identifying extended Kalman filter: Parametric system identification of a vehicle handling model. Proc. Inst. Mech. Eng. Part K J. Multi-Body Dyn. 2007, 221, 87–98. [Google Scholar] [CrossRef]
  32. Hong, S.; Lee, C.; Borrelli, F.; Hendrick, K. A novel approach for vehicle inertial parameter identification using a dual Kalman filter. Intell. Transp. Syst. IEEE Trans. 2015, 16, 151–160. [Google Scholar] [CrossRef]
  33. Yang, B.; Fu, R.; Sun, Q.; Jiang, S.; Wang, C. State estimation of buses: A hybrid algorithm of deep neural network and unscented Kalman filter considering mass identification. Mech. Syst. Signal Process. 2024, 213, 111368. [Google Scholar] [CrossRef]
  34. Universal Robots e-Series User Manual UR16e Original instructions (en); Universal Robots: Odense, Denmark, 2021.
  35. Flash, T.; Hogan, N. The Coordination of Arm Movements: An Experimentally Confirmed Mathematical Model. J. Neurosci. 1985, 5, 1688–1703. [Google Scholar] [CrossRef]
  36. Bazerghi, A.; Goldenberg, A.A.; Apkarian, J. An Exact Kinematic Model of PUMA 600 Manipulator. IEEE Trans. Syst. Man, Cybern. 1984, SMC-14, 483–487. [Google Scholar] [CrossRef]
  37. Szkodny, T. Modelling of Kinematics of the IRb-6 Manipulator. Comput. Math. Appl. 1995, 29, 77–94. [Google Scholar] [CrossRef]
  38. Shen, Y.; Jia, Q.; Wang, R.; Huang, Z.; Chen, G. Learning-Based Visual Servoing for High-Precision Peg-in-Hole Assembly. Actuators 2023, 12, 144. [Google Scholar] [CrossRef]
Figure 1. Free-body diagram of a carried object.
Figure 2. The robot grasping a side of each object for parameter estimation.
Table 1. Standard deviations of Gaussian noise used in the simulation experiments.

Noise Level              Angular Velocity (mrad/s)   Force (mN)   Torque (mNm)
Low Noise                0.25                        50           2.50
Moderate (Mode) Noise    0.50                        100          5.00
High Noise               1.00                        330          6.70
Table 2. Actual parameters of the objects.

Parameter          Object 1         Object 2         Object 3
m (kg)             8.12 × 10^0      9.06 × 10^0      1.02 × 10^1
r_G/R,b,x (m)      2.62 × 10^-1     −1.26 × 10^-1    1.88 × 10^-1
r_G/R,b,y (m)      −1.14 × 10^-1    −2.70 × 10^-1    −5.20 × 10^-2
r_G/R,b,z (m)      3.20 × 10^-2     4.50 × 10^-2     1.70 × 10^-1
I_xx (kg·m2)       2.39 × 10^-1     3.42 × 10^-1     4.43 × 10^-1
I_xy (kg·m2)       1.40 × 10^-2     −3.00 × 10^-2    2.70 × 10^-2
I_xz (kg·m2)       −2.30 × 10^-2    1.15 × 10^-1     −1.11 × 10^-1
I_yy (kg·m2)       2.93 × 10^-1     3.00 × 10^-1     4.43 × 10^-1
I_yz (kg·m2)       8.60 × 10^-2     3.80 × 10^-2     1.11 × 10^-1
I_zz (kg·m2)       3.03 × 10^-1     3.52 × 10^-1     1.96 × 10^-1
Table 3. Estimation error for Object 1.

Estimation        Motion Range 0.1           Motion Range 0.05          Motion Range 0.025
Errors (%)        Low     Mode    High       Low     Mode    High       Low     Mode    High
e_m               0.0     0.0     0.0        0.0     0.0     0.0        0.0     0.0     0.0
e_r_G/R,b,x       0.1     0.1     0.1        0.1     0.1     0.1        0.1     0.1     0.1
e_r_G/R,b,y       0.4     0.4     0.4        0.4     0.4     0.4        0.4     0.4     0.3
e_r_G/R,b,z       6.2     6.2     4.8        7.0     6.6     1.7        5.7     5.1     5.3
e_I_xx            3.3     3.0     1.9        3.4     3.7     0.6        3.0     1.8     9.0
e_I_xy            1.1     1.7     9.0        4.2     0.4     16.3       6.3     6.4     42.1
e_I_xz            0.2     1.2     11.5       1.9     7.6     13.0       3.3     3.2     14.5
e_I_yy            3.6     3.6     1.3        4.8     4.3     0.3        3.5     4.7     5.2
e_I_yz            1.2     0.9     4.8        1.4     1.6     9.3        1.0     3.6     14.8
e_I_zz            0.1     0.0     2.0        0.1     0.2     2.0        0.5     0.2     3.0
Table 4. Estimation error for Object 2.

Estimation        Motion Range 0.1           Motion Range 0.05          Motion Range 0.025
Errors (%)        Low     Mode    High       Low     Mode    High       Low     Mode    High
e_m               0.0     0.0     0.0        0.0     0.0     0.0        0.0     0.0     0.0
e_r_G/R,b,x       0.4     0.4     0.4        0.4     0.4     0.4        0.4     0.4     0.3
e_r_G/R,b,y       0.1     0.1     0.1        0.1     0.1     0.1        0.1     0.1     0.1
e_r_G/R,b,z       0.3     0.1     2.6        0.1     0.5     3.8        0.7     1.8     9.8
e_I_xx            0.2     0.9     5.6        0.6     0.8     7.8        1.2     2.0     7.3
e_I_xy            0.2     3.0     4.5        2.8     1.2     23.5       8.2     5.6     75.5
e_I_xz            2.3     1.9     6.4        2.6     2.0     6.1        2.0     6.4     18.8
e_I_yy            0.0     0.2     4.0        0.7     0.3     8.2        0.8     1.6     22.5
e_I_yz            0.3     1.4     15.9       2.1     9.4     26.9       1.1     0.9     41.0
e_I_zz            0.1     0.6     2.5        0.4     1.0     0.1        1.3     3.0     4.7
Table 5. Estimation error for Object 3.

Estimation        Motion Range 0.1           Motion Range 0.05          Motion Range 0.025
Errors (%)        Low     Mode    High       Low     Mode    High       Low     Mode    High
e_m               0.1     0.1     0.1        0.1     0.1     0.1        0.1     0.1     0.1
e_r_G/R,b,x       0.1     0.1     0.1        0.1     0.1     0.1        0.1     0.1     0.1
e_r_G/R,b,y       0.5     0.5     0.5        0.5     0.5     0.5        0.5     0.5     0.5
e_r_G/R,b,z       1.0     1.0     0.4        1.0     1.0     1.8        0.9     0.0     7.6
e_I_xx            3.0     3.3     0.6        3.3     3.2     9.8        3.6     1.1     35.8
e_I_xy            0.7     4.9     8.3        0.1     3.4     22.1       4.4     8.8     18.8
e_I_xz            0.0     0.3     1.0        0.9     0.4     9.6        2.4     0.8     20.9
e_I_yy            4.5     4.6     1.6        4.3     5.1     10.4       4.5     0.0     33.5
e_I_yz            3.2     2.5     1.9        2.9     0.4     0.2        0.1     2.8     9.2
e_I_zz             0.6    0.2     1.1        0.6     1.8     3.7        1.4     0.4     7.3
Table 6. Range of joint angles and maximum joint velocity and acceleration during the estimation by each motion range.

                                         Motion Range 0.1              Motion Range 0.05             Motion Range 0.025
                                 Joint   Linear         Angular        Linear         Angular        Linear         Angular
Range of joint motion (rad)      q1      2.44 × 10^-1   2.41 × 10^-2   1.13 × 10^-1   1.21 × 10^-2   5.43 × 10^-2   6.10 × 10^-3
                                 q2      6.51 × 10^-2   2.61 × 10^-2   5.03 × 10^-2   1.26 × 10^-2   2.94 × 10^-2   6.20 × 10^-3
                                 q3      2.20 × 10^-1   5.80 × 10^-2   9.08 × 10^-2   2.89 × 10^-2   4.07 × 10^-2   1.44 × 10^-2
                                 q4      2.84 × 10^-1   1.31 × 10^-1   1.41 × 10^-1   6.64 × 10^-2   7.01 × 10^-2   3.33 × 10^-2
                                 q5      6.60 × 10^-4   9.99 × 10^-2   3.37 × 10^-4   5.00 × 10^-2   1.62 × 10^-4   2.50 × 10^-2
                                 q6      2.44 × 10^-1   1.24 × 10^-1   1.13 × 10^-1   6.22 × 10^-2   5.43 × 10^-2   3.11 × 10^-2
Maximum joint velocity (rad/s)   q1      4.64 × 10^-1   4.54 × 10^-2   2.14 × 10^-1   2.28 × 10^-2   1.02 × 10^-1   1.14 × 10^-2
                                 q2      1.62 × 10^-1   4.90 × 10^-2   1.74 × 10^-1   2.37 × 10^-2   5.58 × 10^-2   1.17 × 10^-2
                                 q3      4.32 × 10^-1   1.09 × 10^-1   9.08 × 10^-2   5.42 × 10^-2   7.69 × 10^-2   2.71 × 10^-2
                                 q4      5.34 × 10^-1   2.48 × 10^-1   2.65 × 10^-1   1.24 × 10^-1   1.31 × 10^-1   6.23 × 10^-2
                                 q5      1.30 × 10^-3   1.87 × 10^-1   6.06 × 10^-4   9.37 × 10^-2   2.92 × 10^-4   4.69 × 10^-2
                                 q6      4.63 × 10^-1   1.87 × 10^-1   2.13 × 10^-1   9.37 × 10^-2   1.02 × 10^-1   4.69 × 10^-2
Maximum joint acceleration       q1      1.67 × 10^0    1.54 × 10^-1   7.46 × 10^-1   7.71 × 10^-2   3.51 × 10^-1   3.86 × 10^-2
(rad/s2)                         q2      7.00 × 10^-1   1.59 × 10^-1   3.67 × 10^-1   7.81 × 10^-2   1.90 × 10^-1   3.88 × 10^-2
                                 q3      1.60 × 10^0    3.36 × 10^-1   6.03 × 10^-1   1.67 × 10^-1   2.54 × 10^-1   8.35 × 10^-2
                                 q4      1.66 × 10^0    7.67 × 10^-1   8.22 × 10^-1   3.82 × 10^-1   4.07 × 10^-1   1.92 × 10^-1
                                 q5      2.97 × 10^-2   5.78 × 10^-1   1.29 × 10^-2   2.89 × 10^-1   6.30 × 10^-3   1.45 × 10^-1
                                 q6      1.55 × 10^0    5.77 × 10^-1   6.88 × 10^-1   2.89 × 10^-1   3.22 × 10^-1   1.43 × 10^-1
Table 7. Range of displacement, maximum velocity, and acceleration of the end-effector during the estimation in previous studies and this study with a motion range of 0.1. The lowest value in each direction is highlighted.

Item                                      Direction   Method in [13]   Method in [18]   Method in [26]   Proposed Method
Displacement range, Linear (cm)           x           107              121              99.5             10
                                          y           59.3             42.3             89.1             10
                                          z           12.2             22.8             54.7             10
Displacement range, Angular (mrad)        x           2610             124              4530             100
                                          y           1660             388              3100             100
                                          z           914              1200             3710             100
Maximum speed, Linear (cm/s)              x           126              139              239              18.8
                                          y           89.6             71.5             186              18.8
                                          z           15.0             23.9             131              18.8
Maximum speed, Angular (mrad/s)           x           3670             199              3640             188
                                          y           2350             424              5630             188
                                          z           1500             1210             4570             188
Maximum acceleration, Linear (cm/s2)      x           226              242              1230             57.7
                                          y           253              225              905              57.7
                                          z           31.5             25.6             934              57.7
Maximum acceleration, Angular (mrad/s2)   x           9520             548              20,900           577
                                          y           5640             831              24,400           577
                                          z           4770             2490             21,800           577
Table 8. Range of joint angles and maximum joint velocity and acceleration during the estimation in previous studies.

Motion                                 Joint   Method in [13]   Method in [18]   Method in [26]
Range of joint angles (rad)            q1      1.57             1.22             2.59
                                       q2      1.05             0.52             1.44
                                       q3      1.57             1.22             1.14
                                       q4      3.14             0.70             1.17
                                       q5      1.57             2.44             3.24
                                       q6      1.57             —                1.84
                                       q7      —                —                0.84
Maximum joint velocity (rad/s)         q1      2.94             2.29             2.21
                                       q2      1.96             0.98             2.21
                                       q3      2.94             2.29             1.20
                                       q4      5.89             1.31             2.10
                                       q5      2.94             4.58             2.30
                                       q6      2.94             —                2.10
                                       q7      —                —                2.50
Maximum joint acceleration (rad/s2)    q1      4.53             3.85             3.77
                                       q2      3.03             1.65             6.81
                                       q3      4.53             3.85             2.53
                                       q4      9.06             2.20             7.54
                                       q5      4.53             7.70             3.26
                                       q6      4.53             —                4.80
                                       q7      —                —                14.8
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
