Communication

Design and Implementation of a Robotic Arm Assistant with Voice Interaction Using Machine Vision

Department of Electrical and Computer Engineering, University of Western Macedonia, 501 00 Kozani, Greece
* Authors to whom correspondence should be addressed.
Academic Editor: Felipe N. Martins
Automation 2021, 2(4), 238-251; https://doi.org/10.3390/automation2040015
Received: 31 August 2021 / Revised: 19 October 2021 / Accepted: 26 October 2021 / Published: 31 October 2021
(This article belongs to the Collection Smart Robotics for Automation)
Technological growth over the last few decades has driven the development of numerous application domains, and robotics is one that has expanded massively in recent years. The adoption of robotic systems in commercial and non-commercial environments has increased productivity and efficiency and raised quality of life. Many researchers have developed robotics-based systems that improve aspects of people's lives; however, most rely on high-cost robotic arms that are usually out of reach for typical consumers. We address this gap by presenting a low-cost, high-accuracy robotic assistant for every consumer, aimed in particular at people with physical and mobility impairments. The system is based on the Niryo One robotic arm, equipped with a USB (Universal Serial Bus) HD (High-Definition) camera on the end-effector. To achieve high accuracy, we modified the YOLO algorithm by adding novel features and additional computations that feed the kinematic model. We evaluated the proposed system in experiments with PhD students from our laboratory and demonstrated its effectiveness. The experimental results indicate that the robotic arm can detect and deliver the requested object in a timely manner with 96.66% accuracy.
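The abstract describes feeding object-detection output into the arm's kinematic model. As an illustration only (the paper's own computations are not reproduced here), a detector's bounding box can be converted to a 3D grasp target via a pinhole camera model; the intrinsics and depth below are hypothetical placeholder values, not figures from the article:

```python
# Illustrative sketch: map a YOLO bounding-box center (pixels) to a 3D point
# in the camera frame, so it can be handed to an arm's kinematic model.
# Intrinsics (fx, fy, cx, cy) and the depth value are assumed placeholders.

def bbox_center(x_min, y_min, x_max, y_max):
    """Center of a detector bounding box in pixel coordinates."""
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

def pixel_to_camera(u, v, depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Back-project pixel (u, v) at a known depth (metres) to camera XYZ."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Example: a detection covering pixels (300, 220)-(340, 260), assumed 0.5 m away.
u, v = bbox_center(300, 220, 340, 260)
target_xyz = pixel_to_camera(u, v, 0.5)
```

The resulting camera-frame point would still need a hand-eye transform into the robot's base frame before inverse kinematics is applied.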
Keywords: robotics; robotic arm; niryo-one; machine vision; YOLO; opencv; assistance robot
MDPI and ACS Style

Nantzios, G.; Baras, N.; Dasygenis, M. Design and Implementation of a Robotic Arm Assistant with Voice Interaction Using Machine Vision. Automation 2021, 2, 238-251. https://doi.org/10.3390/automation2040015
