Volume 23, Issue 5 (May 2023) | Modares Mechanical Engineering 2023, 23(5): 307-316

Heidari H, Ghahri Saremi T. Learning of the 5-finger robot hand using deep learning for stable grasping. Modares Mechanical Engineering 2023; 23(5): 307-316
URL: http://mme.modares.ac.ir/article-15-66532-en.html
1- Associate Professor, hr.heidari@malayeru.ac.ir
2- MSc. Student
Abstract:
The human hand is one of the most complex organs of the human body, capable of performing skilled tasks. Manipulation, especially grasping, is a critical ability for robots; however, grasping objects with a robot hand remains a challenging problem. Many researchers have applied deep learning and computer vision methods to address it. This paper presents a humanoid 5-degree-of-freedom robot hand for grasping objects. The robotic hand is fabricated with a 3D printer, and 5 servo motors drive the fingers. To simplify the hand, a tendon-based transmission system was chosen that allows flexion and extension of the robot's fingers. The purpose of this article is to use a deep learning algorithm to grasp different objects semi-automatically. To this end, a convolutional neural network is trained on more than 600 images collected by a camera mounted on the robot's hand. The performance of the trained network is then tested on different objects under similar conditions. Finally, the robot hand is able to grasp objects successfully with 85% accuracy.
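As a rough illustration of the kind of pipeline the abstract describes, the sketch below shows a small convolutional classifier that labels wrist-camera images as stable or unstable grasps. The architecture, input resolution, labels, and hyperparameters are assumptions made for illustration; they are not taken from the paper.

```python
# Minimal sketch of a CNN grasp classifier (assumed: binary stable/unstable
# label per 128x128 RGB image from the hand-mounted camera; layer sizes and
# training settings are illustrative only).
import torch
import torch.nn as nn

class GraspCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Three conv/pool stages reduce 128x128 input to 16x16 feature maps.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fully connected head outputs a single logit: stable vs. unstable grasp.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One training step on a dummy batch standing in for the collected images.
model = GraspCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

images = torch.randn(8, 3, 128, 128)          # stand-in for camera images
labels = torch.randint(0, 2, (8, 1)).float()  # stand-in grasp-success labels
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```

In such a setup, the predicted logit would be thresholded at deployment time to decide whether the hand attempts the grasp; the paper's actual network structure and decision logic may differ.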
Full-Text [PDF 775 kb]
Article Type: Original Research | Subject: Mechatronics
Received: 2023/01/03 | Accepted: 2023/03/29 | Published: 2023/04/30

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.