Custom TOF Gripper to Improve Grasping Robustness
This research project is ongoing; the information below is not final and will be updated periodically.
Current deep-learning grasp planners are good at predicting successful grasps, but they are not perfect, and their accuracy declines when faced with noisy sensors, poor lighting, or external disturbances. Previous work in the field has attempted to bridge this gap with tactile sensors: after grasping the object, the system predicts whether the grasp is "good" by some metric (usually that the robot can maintain the grasp for a certain period of time or move the object to a target location).
I propose a method that uses multi-zone time-of-flight sensors mounted in the distal links of each finger. After the gripper moves to a grasp pose, the grasp outcome is predicted without closing the fingers or contacting the object.
This project is composed of multiple pieces, each of which is described in more detail in the following sections.
Custom gripper design
Mechanical design/assembly
Electrical, wiring, and control
Data collection
Machine learning classifier
Over the course of this project I have applied the following skills:
Programming/software: Python, C++, ROS, ROS 2, Docker, Linux, Git, Machine Learning
Robotics: ROS, 7-DOF arm control and planning, custom URDF/SDF
Mechanical: CAD, prototyping, 3D printing
Electrical: Raspberry Pi, sensor wiring/integration
Custom Gripper Design
I designed the gripper in Solidworks and 3D printed the body and finger links. Silicone is cast onto the distal links to form high-friction, deformable finger pads. The hand is fully actuated, with two joints and two motors per finger.
The motors are Dynamixel XL-330 servos, and the time-of-flight sensors are ST VL53L7CX 8x8 multizone units. Additionally, IMUs are mounted in the fingertips (not yet used in this project). A Raspberry Pi in the palm of the gripper reads sensor data and controls the motors via custom ROS 2 nodes written in Python and C++.
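To make the sensor plumbing concrete, here is a minimal sketch of a ROS 2 node in Python that polls one fingertip sensor and publishes its 8x8 distance grid. The read_tof_zones() helper, topic name, and queue size are illustrative placeholders rather than the project's actual interface; the 15 Hz rate is the VL53L7CX's maximum in 8x8 mode.

```python
# Minimal sketch of a fingertip TOF publisher node (assumptions noted below).
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray


def read_tof_zones() -> list:
    """Hypothetical stand-in for the real VL53L7CX driver read.

    Returns 64 distances in millimeters, one per zone of the 8x8 grid.
    """
    return [0.0] * 64  # placeholder; replace with the actual driver call


class TofPublisher(Node):
    def __init__(self):
        super().__init__('tof_publisher')
        # Topic name and queue depth are illustrative choices.
        self.pub = self.create_publisher(Float32MultiArray, 'finger/tof_zones', 10)
        # The VL53L7CX ranges at up to 15 Hz in 8x8 mode, so poll at that rate.
        self.timer = self.create_timer(1.0 / 15.0, self.tick)

    def tick(self):
        msg = Float32MultiArray()
        msg.data = read_tof_zones()
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(TofPublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```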
Data Collection
I collected nearly 3,000 grasps with a variety of objects, including custom 3D-printed shapes and objects from the YCB set. For each object, random grasp poses were sampled by perturbing the pose along all six axes (three in translation, three in rotation).
For each grasp, the object was lifted and moved approximately 20 cm to a drop-off point. If the object remained in the gripper for the duration of the trial, the grasp was marked as a success.
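As a rough illustration of the sampling step, the sketch below draws a random 6-axis perturbation around a nominal grasp pose. The perturbation ranges are assumed placeholders, not the values used in the actual data collection.

```python
# Illustrative 6-axis grasp-pose perturbation sampling.
import numpy as np

rng = np.random.default_rng(0)


def sample_grasp_offset(max_trans_m: float = 0.02, max_rot_rad: float = 0.15):
    """Sample a random offset along all six axes: xyz translation + rpy rotation.

    Ranges are illustrative; the real experiment's limits may differ.
    """
    translation = rng.uniform(-max_trans_m, max_trans_m, size=3)  # meters
    rotation = rng.uniform(-max_rot_rad, max_rot_rad, size=3)     # radians (roll, pitch, yaw)
    return translation, rotation


# Example: generate a few perturbed poses around one object's nominal grasp.
for _ in range(3):
    t, r = sample_grasp_offset()
    print("translation offset:", t, "rotation offset:", r)
```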
Machine Learning Classifier
Currently in progress.
I am exploring both neural networks and random forests for classification. Following standard post-processing and optimization, accuracy for both classifiers is in the 85-90% range across fully unseen test objects.
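For a sense of what the random-forest branch could look like, here is a minimal scikit-learn sketch. It assumes each grasp is represented by the flattened TOF readings from both fingertips (2 x 8 x 8 = 128 features) with a binary success label; the synthetic data, feature layout, and forest size are illustrative stand-ins, not the project's actual pipeline.

```python
# Sketch of a random-forest grasp-outcome classifier with an object-level split.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(0)
n_grasps, n_features = 3000, 128
X = rng.random((n_grasps, n_features))    # placeholder TOF features per grasp
y = rng.integers(0, 2, n_grasps)          # placeholder success/failure labels
objects = rng.integers(0, 30, n_grasps)   # object ID associated with each grasp

# Split by object ID so that test objects never appear in training.
splitter = GroupShuffleSplit(test_size=0.2, random_state=0)
train_idx, test_idx = next(splitter.split(X, y, groups=objects))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X[train_idx], y[train_idx])
print("held-out-object accuracy:", clf.score(X[test_idx], y[test_idx]))
```

Splitting by object ID rather than by individual grasp keeps the test objects fully unseen, matching the evaluation described above.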