Data-Driven Robotic Tactile Grasping for Hyper-Personalization Line Pick-and-Place
Abstract
1. Introduction
2. Literature Review
2.1. 3D Grasping Pose Sampling Benchmarks
2.2. Soft Gripping Technology
2.3. Tactile Sensors and Slipping Detection
2.4. Data-Driven Approach
2.5. Deep Reinforcement Learning Approach
3. Methodologies
1. Hyper-personalization line (HPL) software pipeline;
2. ROI filtering based on Mask R-CNN;
3. Force feedback on grasping stability and success rate;
4. PointNetGPD grasp pose estimation using raw point cloud data.
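The four components above form a single perception-to-action flow: instance masks restrict the point cloud to per-object regions, grasp poses are estimated on the raw points, and tactile feedback judges the executed grasp. A minimal structural sketch of that flow, assuming stub data (all class and function names here are hypothetical illustrations, not the paper's implementation):

```python
from dataclasses import dataclass

@dataclass
class GraspAttempt:
    object_id: str
    stable: bool = False
    succeeded: bool = False

def filter_roi(scene: dict) -> list:
    # (2): Mask R-CNN instance masks restrict the scene to per-object
    # regions of interest (masks are assumed precomputed in this sketch).
    return [obj for obj, mask in scene["masks"].items() if mask]

def estimate_grasp_pose(object_id: str) -> tuple:
    # (4): PointNetGPD would score candidate grasps on the ROI's raw
    # point cloud; a placeholder pose stands in here.
    return (object_id, "top-down", 0.0)

def execute_with_force_feedback(pose: tuple, slip_threshold: float = 0.2) -> GraspAttempt:
    # (3): a tactile slip signal above the threshold would trigger the
    # adaptive anti-slip controller; 0.1 is a stand-in sensor reading.
    slip_signal = 0.1
    ok = slip_signal < slip_threshold
    return GraspAttempt(object_id=pose[0], stable=ok, succeeded=ok)

def hpl_pipeline(scene: dict) -> list:
    # (1): the HPL software pipeline chains the three stages per object.
    return [execute_with_force_feedback(estimate_grasp_pose(obj))
            for obj in filter_roi(scene)]
```

The sketch only fixes the data flow between the four contributions; each stub would be replaced by the corresponding module described in Sections 3.1–3.5.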
3.1. HPL Software Pipeline
3.2. Mask R-CNN Based Region of Interest (ROI) Filtering
3.3. Sensor Feedback on Grasping Success and Stability
- Sensor calibration;
- Actual use condition, grip stability, and slippage;
- Sensor placement for accurate position of gripping contact.
3.4. Robotic Grasping Based on PointNetGPD
3.5. Adaptive Anti-Slip Force Control
3.6. Trajectory Planning
4. Result Analysis
5. Future Works
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Bormann, R.; de Brito, B.F.; Lindermayr, J.; Omainska, M.; Patel, M. Towards automated order picking robots for warehouses and retail. In Proceedings of the Computer Vision Systems: 12th International Conference, ICVS 2019, Thessaloniki, Greece, 23–25 September 2019; pp. 185–198. [Google Scholar]
- Xie, Z.; Somani, N.; Tan, Y.J.S.; Seng, J.C.Y. Automatic Toolpath Pattern Recommendation for Various Industrial Applications based on Deep Learning. In Proceedings of the 2021 IEEE/SICE International Symposium on System Integration (SII), Narvik, Norway, 8–12 January 2021; pp. 60–65. [Google Scholar]
- Zhen, X.; Seng, J.C.Y.; Somani, N. Adaptive Automatic Robot Tool Path Generation Based on Point Cloud Projection Algorithm. In Proceedings of the 2019 24th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Zaragoza, Spain, 10–13 September 2019; pp. 341–347. [Google Scholar]
- Liang, H.; Ma, X.; Li, S.; Görner, M.; Tang, S.; Fang, B.; Sun, F.; Zhang, J. PointNetGPD: Detecting grasp configurations from point sets. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 3629–3635. [Google Scholar]
- Zeng, A.; Song, S.; Welker, S.; Lee, J.; Rodriguez, A.; Funkhouser, T. Learning synergies between pushing and grasping with self-supervised deep reinforcement learning. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4238–4245. [Google Scholar]
- Xie, Z.; Seng, J.C.Y.; Lim, G. AI-Enabled Soft Versatile Grasping for High-Mixed-Low-Volume Applications with Tactile Feedbacks. In Proceedings of the 2022 27th International Conference on Automation and Computing (ICAC), Bristol, UK, 1–3 September 2022; pp. 1–6. [Google Scholar]
- Xie, Z.; Liang, X.; Roberto, C. Learning-based Robotic Grasping: A Review. Front. Robot. AI 2023, 10, 1038658. [Google Scholar] [CrossRef] [PubMed]
- Cao, H.; Fang, H.-S.; Liu, W.; Lu, C. SuctionNet-1Billion: A large-scale benchmark for suction grasping. IEEE Robot. Autom. Lett. 2021, 6, 8718–8725. [Google Scholar] [CrossRef]
- Jiang, Y.; Moseson, S.; Saxena, A. Efficient grasping from rgbd images: Learning using a new rectangle representation. In Proceedings of the 2011 IEEE International conference on robotics and automation, Shanghai, China, 9–13 May 2011; pp. 3304–3311. [Google Scholar]
- Goldfeder, C.; Ciocarlie, M.; Dang, H.; Allen, P.K. The columbia grasp database. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 1710–1716. [Google Scholar]
- Chao, Y.-W.; Yang, W.; Xiang, Y.; Molchanov, P.; Handa, A.; Tremblay, J.; Narang, Y.S.; Van Wyk, K.; Iqbal, U.; Birchfield, S. DexYCB: A benchmark for capturing hand grasping of objects. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 9044–9053. [Google Scholar]
- Mahler, J.; Matl, M.; Liu, X.; Li, A.; Gealy, D.; Goldberg, K. Dex-net 3.0: Computing robust vacuum suction grasp targets in point clouds using a new analytic model and deep learning. In Proceedings of the 2018 IEEE International Conference on robotics and automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 5620–5627. [Google Scholar]
- Mousavian, A.; Eppner, C.; Fox, D. 6-dof graspnet: Variational grasp generation for object manipulation. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Korea, 27 October–2 November 2019; pp. 2901–2910. [Google Scholar]
- Fang, H.; Fang, H.-S.; Xu, S.; Lu, C. TransCG: A Large-Scale Real-World Dataset for Transparent Object Depth Completion and Grasping. IEEE Robot. Autom. Lett. 2022, 7, 7383–7390. [Google Scholar] [CrossRef]
- Shintake, J.; Cacucciolo, V.; Floreano, D.; Shea, H. Soft robotic grippers. Adv. Mater. 2018, 30, 1707035. [Google Scholar] [CrossRef] [PubMed]
- Goncalves, A.; Kuppuswamy, N.; Beaulieu, A.; Uttamchandani, A.; Tsui, K.M.; Alspach, A. Punyo-1: Soft tactile-sensing upper-body robot for large object manipulation and physical human interaction. In Proceedings of the 2022 IEEE 5th International Conference on Soft Robotics (RoboSoft), Edinburgh, UK, 4–8 April 2022; pp. 844–851. [Google Scholar]
- Lambeta, M.; Chou, P.-W.; Tian, S.; Yang, B.; Maloon, B.; Most, V.R.; Stroud, D.; Santos, R.; Byagowi, A.; Kammerer, G.; et al. DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor With Application to In-Hand Manipulation. IEEE Robot. Autom. Lett. 2020, 5, 3838–3845. [Google Scholar] [CrossRef]
- Fishel, J.A.; Loeb, G.E. Sensing tactile microvibrations with the BioTac—Comparison with human sensitivity. In Proceedings of the 2012 4th IEEE RAS & EMBS international conference on biomedical robotics and biomechatronics (BioRob), Rome, Italy, 24–27 June 2012; pp. 1122–1127. [Google Scholar]
- Mittendorfer, P.; Cheng, G. Humanoid multimodal tactile-sensing modules. IEEE Trans. Robot. 2011, 27, 401–410. [Google Scholar] [CrossRef]
- Cavallo, A.; Costanzo, M.; De Maria, G.; Natale, C. Modeling and slipping control of a planar slider. Automatica 2020, 115, 108875. [Google Scholar] [CrossRef]
- Costanzo, M.; Maria, G.D.; Natale, C. Slipping Control Algorithms for Object Manipulation with Sensorized Parallel Grippers. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 7455–7461. [Google Scholar]
- Huang, S.-J.; Chang, W.-H.; Su, J.-Y. Intelligent robotic gripper with adaptive grasping force. Int. J. Control. Autom. Syst. 2017, 15, 2272–2282. [Google Scholar] [CrossRef]
- Deng, Z.; Jonetzko, Y.; Zhang, L.; Zhang, J. Grasping Force Control of Multi-Fingered Robotic Hands through Tactile Sensing for Object Stabilization. Sensors 2020, 20, 1050. [Google Scholar] [CrossRef]
- Fischinger, D.; Vincze, M.; Jiang, Y. Learning grasps for unknown objects in cluttered scenes. In Proceedings of the 2013 IEEE international conference on robotics and automation, Karlsruhe, Germany, 6–10 May 2013; pp. 609–616. [Google Scholar]
- Schmidt, P.; Vahrenkamp, N.; Wächter, M.; Asfour, T. Grasping of unknown objects using deep convolutional neural networks based on depth images. In Proceedings of the 2018 IEEE international conference on robotics and automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 6831–6838. [Google Scholar]
- Levine, S.; Pastor, P.; Krizhevsky, A.; Ibarz, J.; Quillen, D. Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection. Int. J. Robot. Res. 2017, 37, 421–436. [Google Scholar] [CrossRef]
- Varley, J.; Weisz, J.; Weiss, J.; Allen, P. Generating multi-fingered robotic grasps via deep learning. In Proceedings of the 2015 IEEE/RSJ international conference on intelligent robots and systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 4415–4420. [Google Scholar]
- Ma, B.; Li, X.; Xia, Y.; Zhang, Y. Autonomous deep learning: A genetic DCNN designer for image classification. Neurocomputing 2020, 379, 152–161. [Google Scholar] [CrossRef]
- Xie, Z.; Zhong, Z.W. Unmanned Vehicle Path Optimization Based on Markov Chain Monte Carlo Methods. Appl. Mech. Mater. 2016, 829, 133–136. [Google Scholar] [CrossRef]
- Bohg, J.; Morales, A.; Asfour, T.; Kragic, D. Data-Driven Grasp Synthesis—A Survey. IEEE Trans. Robot. 2014, 30, 289–309. [Google Scholar] [CrossRef]
- Fang, H.-S.; Wang, C.; Gou, M.; Lu, C. GraspNet-1Billion: A large-scale benchmark for general object grasping. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 11444–11453. [Google Scholar]
- Rajeswaran, A.; Kumar, V.; Gupta, A.; Vezzani, G.; Schulman, J.; Todorov, E.; Levine, S. Learning complex dexterous manipulation with deep reinforcement learning and demonstrations. arXiv 2017, arXiv:1709.10087. [Google Scholar]
- Wu, B.; Akinola, I.; Allen, P.K. Pixel-attentive policy gradient for multi-fingered grasping in cluttered scenes. In Proceedings of the 2019 IEEE/RSJ international conference on intelligent robots and systems (IROS), Macau, China, 3–8 November 2019; pp. 1789–1796. [Google Scholar]
- Liu, R.; Nageotte, F.; Zanne, P.; de Mathelin, M.; Dresp-Langley, B. Deep Reinforcement Learning for the Control of Robotic Manipulation: A Focussed Mini-Review. Robotics 2021, 10, 22. [Google Scholar] [CrossRef]
- Kleeberger, K.; Bormann, R.; Kraus, W.; Huber, M.F. A survey on learning-based robotic grasping. Curr. Robot. Rep. 2020, 1, 239–249. [Google Scholar] [CrossRef]
- Kopicki, M.; Detry, R.; Schmidt, F.; Borst, C.; Stolkin, R.; Wyatt, J.L. Learning dexterous grasps that generalise to novel objects by combining hand and contact models. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–07 June 2014; pp. 5358–5365. [Google Scholar]
- Bhagat, S.; Banerjee, H.; Ho Tse, Z.T.; Ren, H. Deep reinforcement learning for soft, flexible robots: Brief review with impending challenges. Robotics 2019, 8, 4. [Google Scholar] [CrossRef]
- Wu, B.; Akinola, I.; Varley, J.; Allen, P. Mat: Multi-fingered adaptive tactile grasping via deep reinforcement learning. arXiv 2019, arXiv:1909.04787. [Google Scholar]
- Mohammed, M.Q.; Chung, K.L.; Chyi, C.S. Review of Deep Reinforcement Learning-Based Object Grasping: Techniques, Open Challenges, and Recommendations. IEEE Access 2020, 8, 178450–178481. [Google Scholar] [CrossRef]
- Nian, R.; Liu, J.; Huang, B. A review On reinforcement learning: Introduction and applications in industrial process control. Comput. Chem. Eng. 2020, 139, 106886. [Google Scholar] [CrossRef]
- Patnaik, S. New Paradigm of Industry 4.0; Springer: Manhattan, NY, USA, 2020. [Google Scholar]
- François-Lavet, V.; Henderson, P.; Islam, R.; Bellemare, M.G.; Pineau, J. An introduction to deep reinforcement learning. arXiv 2018, arXiv:1811.12560. [Google Scholar]
- Hays, T.; Keskinocak, P.; De López, V.M. Strategies and challenges of internet grocery retailing logistics. Appl. Supply Chain. Manag. E-Commer. Res. 2005, 92, 217–252. [Google Scholar]
- Cano, J.A.; Correa-Espinal, A.A.; Gómez-Montoya, R.A. An evaluation of picking routing policies to improve warehouse efficiency. Int. J. Ind. Eng. Manag. 2017, 8, 229. [Google Scholar]
- Wadhwa, R.S. Flexibility in manufacturing automation: A living lab case study of Norwegian metalcasting SMEs. J. Manuf. Syst. 2012, 31, 444–454. [Google Scholar] [CrossRef]
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask r-cnn. In Proceedings of the IEEE international conference on computer vision, Venice, Italy, 22–29 October 2017; pp. 2961–2969. [Google Scholar]
- Lin, T.-Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft Coco: Common Objects in Context; Lecture Notes in Computer Science; Springer: Manhattan, NY, USA, 2014; pp. 740–755. [Google Scholar]
- Xie, Z.; Zhong, Z. Visual Conspicuity Measurements of Approaching and Departing Civil Airplanes using Lab Simulations. Int. J. Simul. Syst. Sci. Technol. 2016, 17, 81–86. [Google Scholar]
- Xie, Z.; Zhong, Z. Simulated civil airplane visual conspicuity experiments during approaching and departure in the airport vicinity. In Proceedings of the 2016 7th International Conference on Intelligent Systems, Modelling and Simulation (ISMS), Bangkok, Thailand, 25–27 January 2016; pp. 279–282. [Google Scholar]
- Schaller, R. Moore’s law: Past, present and future. IEEE Spectr. 1997, 34, 52–59. [Google Scholar] [CrossRef]
- Mahler, J.; Matl, M.; Satish, V.; Danielczuk, M.; DeRose, B.; McKinley, S.; Goldberg, K. Learning ambidextrous robot grasping policies. Sci. Robot. 2019, 4, eaau4984. [Google Scholar] [CrossRef]
- Sucan, I.A.; Moll, M.; Kavraki, L.E. The Open Motion Planning Library. IEEE Robot. Autom. Mag. 2012, 19, 72–82. [Google Scholar] [CrossRef]
- Bohlin, R.; Kavraki, L.E. Path planning using lazy PRM. In Proceedings of the 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA, 24–28 April 2000; pp. 521–528. [Google Scholar]
- Kavraki, L.; Svestka, P.; Latombe, J.-C.; Overmars, M. Probabilistic roadmaps for path planning in high-dimensional configuration spaces. IEEE Trans. Robot. Autom. 1996, 12, 566–580. [Google Scholar] [CrossRef]
- Sánchez, G.; Latombe, J.-C. A Single-Query Bi-Directional Probabilistic Roadmap Planner with Lazy Collision Checking. In Proceedings of the International Symposium of Robotic Research, Lorne, Victoria, Australia, 9–12 November 2001; pp. 403–417. [Google Scholar]
- Sucan, I.A.; Kavraki, L.E. Kinodynamic motion planning by interior-exterior cell exploration. Algorithmic Found. Robot. VIII 2009, 57, 449–464. [Google Scholar]
- Xie, Z.; Zhong, Z.W. Changi Airport Passenger Volume Forecasting Based On An Artificial Neural Network. Far East J. Electron. Commun. 2016, 2, 163–170. [Google Scholar] [CrossRef]
- Bormann, R.; Wang, X.; Völk, M.; Kleeberger, K.; Lindermayr, J. Real-time Instance Detection with Fast Incremental Learning. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 13056–13063. [Google Scholar]
- Liang, R.; Wang, Y.; Xie, Z.; Musaoglu, S. A Vacuum-Powered Soft Robotic Gripper for Circular Objects Picking. In Proceedings of the 2023 6th IEEE-RAS International Conference on Soft Robotics (RoboSoft 2023), Singapore, 3–7 April 2023. [Google Scholar]
| Type/Brand | Payload | Grasping Objects | Tech Specs |
|---|---|---|---|
| Soft Robotics Inc. mGrip | Circular: diameter up to 145 mm; parallel: width up to 100 mm; up to 3.4 kg | Food products of various sizes and shapes | Operating pressure −5 to 10 psi; IP67; weight 334 g (2-finger), 577 g (4-finger) |
| OnRobot Soft Gripper | 2.2 kg max | Delicate/fragile objects, irregular shapes | Total weight 0.77 + 0.168 kg (base + attachment); grip dimension range 11–75 mm |
| Soft Robot Tech Co. (SRT) SFG | 250–1550 g | Abnormal/fragile objects | Different models selectable according to weight/dimension specs |
| UBIROS Gentle Duo | 1.5 kg (3.3 lb) payload | Delicate objects or products that vary in size, shape, and weight during primary packaging | Fully electrical; weight 800 g (1.8 lb); finger distance (base) 28 mm (1.1 in) |
| FESTO DHEF | 1 kg | Delicate products, undefined shapes | Weight 475 g; position-sensing accessories available |
| Rochu Soft Robotic Gripper | Varies by finger module (catalog of modules with different max loads) | Food, textile, glass, logistics, and electronics | Pressure-actuated; gripper weight depends on the model used |
| Piab piSOFTGRIP | 250 g; 10–30 mm objects | Sensitive, lightweight objects with odd geometries and/or unusual surfaces | Weight 170 g |
| Roplus UnisoGrip | Up to 2.5 kg; 10–30 cm gripping width | FMCG products | UR/ABB arm compatible; product weight 2.5 kg |
| SoftGripping by Wegard GmbH SoftGripper | 200–400 g | Rectangular, cubic objects, 10–90 mm | 120 mm height, 150 mm diameter |
| Applied Robotics Flexible Smart Gripper | 1.5 kg | Delicate objects | Fully electric; compatible with collaborative robots; operating temperature 5–60 °C; enhanced version provides force control |
| Gripwiq SofTouch | 200 g to 6 kg | Packaged goods, food | 4-finger principle: round, Ø 40–220 mm; 2-finger principle: square, 40–250 mm width and 100–350 mm length |
| The Gripper Company 4-Finger Lip | 25–96 mm diameter objects | Fragile objects | Max push/pull 100 N |
| Schunk ADHESO | N.A. | Fragile objects, electronics | UR compatible; adhesion-based |
| Type | Classification | Advantages | Disadvantages |
|---|---|---|---|
| Resistive | Normal pressure | Good sensitivity; low cost; simple; flexible | Generally detects only a single contact; high power consumption |
| Capacitive | Normal pressure | Excellent sensitivity; good spatial resolution; large dynamic range | Hysteresis; complex measurement electronics |
| Piezoresistive | Skin deformation | High spatial resolution; high scanning rate in mesh-structured sensors | Lower repeatability; hysteresis; temperature sensitive |
| Piezoelectric | Skin deformation | High-frequency response; high sensitivity; high dynamic range | Poor spatial resolution; dynamic sensing only; temperature sensitive |
| Magnetic | Skin deformation | High sensitivity; good dynamic range; no mechanical hysteresis; physically robust | Suffers from magnetic interference; complex computations; high power consumption |
| Fiber optic | Skin deformation | Flexible; small size; high sensitivity; biocompatible; remote sensing capability | Fragile; relatively high initial cost of the interrogator |
| Ultrasonic | Skin deformation | Fast dynamic response; good force resolution | Complex electronics; temperature sensitive; limited utility at low frequency |
| Grasping Methods | Total Attempts | Failures | 2nd Attempts | Success Rate | Computation Time |
|---|---|---|---|---|---|
| Soft grasping on familiar products with anti-slip control (FMCG) | 120 | 38 | 15 | 80.8% | 11.5 s |
| Soft grasping on familiar products without anti-slip control (FMCG) | 120 | 54 | 12 | 65.0% | 11.7 s |
| Soft grasping on unfamiliar products with anti-slip control (vegetables and fruit) | 120 | 51 | 17 | 70.8% | 13.2 s |
| Visual pushing grasping (grasping) | 120 | 62 | 19 | 64.2% | 17.8 s |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xie, Z.; Chen, J.Y.S.; Lim, G.W.; Bai, F. Data-Driven Robotic Tactile Grasping for Hyper-Personalization Line Pick-and-Place. Actuators 2023, 12, 192. https://doi.org/10.3390/act12050192