Cable Tracing with Tactile Sensors and Learning-Based Pose Estimation
For the class project in Learning-Based Control (ROB 537), I worked with Jostan Brown, Keegan Nave, and Marcus Rosette to trace a cable using machine learning. We used a custom parallel-jaw gripper with a tactile sensor array on each fingertip. Our approach is outlined below:
1. We model the local pose of the cable as a circle of radius R together with its entry and exit points (x, y) on the gripper (a geometry sketch follows this list).
2. We collected ground-truth data (approximately 800 trials) for training, using a custom jig and an overhead camera to capture the cable's curvature and position (a circle-fitting sketch follows).
3. We trained a Long Short-Term Memory (LSTM) network to predict the cable pose from the tactile sensor data recorded as the gripper closes (an example model sketch follows).
4. We mounted the gripper on a UR5e arm and validated our approach to cable tracing.
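To make step 1 concrete: the circle parameterization gives the controller a local direction to follow. The sketch below is illustrative rather than our exact implementation; it assumes the predicted radius is signed to encode which way the cable bends (an assumption for this sketch), and it computes the cable's heading at the exit point so the arm (step 4) can take a small step along the cable.

```python
import numpy as np

def cable_heading(p_in, p_out, R):
    """Estimate the cable's local heading at the exit point.

    p_in, p_out: 2D entry/exit points in the gripper frame (metres).
    R: arc radius; its sign is assumed to encode the bending side
       (this signed-radius convention is an assumption for the sketch).
    Returns a unit vector tangent to the arc at p_out.
    """
    p_in, p_out = np.asarray(p_in, float), np.asarray(p_out, float)
    chord = p_out - p_in
    d = np.linalg.norm(chord) / 2.0
    chord_dir = chord / (2.0 * d)

    # If the arc is nearly straight, or the radius cannot geometrically
    # reach both points, fall back to the chord direction.
    if abs(R) <= d or abs(R) > 100 * d:
        return chord_dir

    # Perpendicular to the chord; the sign of R picks the bending side.
    normal = np.array([-chord_dir[1], chord_dir[0]]) * np.sign(R)
    center = (p_in + p_out) / 2.0 + normal * np.sqrt(R**2 - d**2)

    # The tangent at the exit point is perpendicular to the radius vector.
    radial = p_out - center
    tangent = np.array([-radial[1], radial[0]]) / np.linalg.norm(radial)

    # Keep the tangent pointing "forward" along the cable (entry -> exit).
    return tangent if tangent @ chord_dir > 0 else -tangent

# Example: step the gripper 3 cm along the cable from the current exit point.
heading = cable_heading(p_in=(-0.01, 0.0), p_out=(0.012, 0.004), R=0.08)
next_waypoint = np.array([0.012, 0.004]) + 0.03 * heading
```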
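For step 2, the full camera pipeline is described in the report; as an illustration of how curvature ground truth can be recovered from the overhead view, the sketch below segments the cable by color and fits a circle to the resulting pixels with a least-squares (Kåsa) fit. The threshold values, file name, and pixel-to-metre scale are placeholders, not our calibrated values.

```python
import cv2
import numpy as np

def fit_circle(points_xy):
    """Least-squares (Kasa) circle fit: x^2 + y^2 + a*x + b*y + c = 0."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    radius = np.sqrt(cx**2 + cy**2 - c)
    return (cx, cy), radius

# Segment the cable in the overhead image by color (threshold values are
# placeholders; ours depended on the cable and the lighting in the jig).
image = cv2.imread("overhead_frame.png")
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (0, 80, 80), (20, 255, 255))

# Fit a circle to the cable pixels to get ground-truth curvature; convert
# from pixels to metres using the jig's known scale.
ys, xs = np.nonzero(mask)
center_px, radius_px = fit_circle(np.column_stack([xs, ys]).astype(float))
PIXELS_PER_METRE = 2000.0  # placeholder calibration constant
radius_m = radius_px / PIXELS_PER_METRE
```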
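For step 3, the trained architecture and hyperparameters are given in the report; the following is only a minimal TensorFlow/Keras sketch of the idea: an LSTM that maps the sequence of tactile frames recorded while the gripper closes to five regression targets (the radius plus the entry and exit points). The sequence length, taxel count, and layer sizes here are placeholders.

```python
import tensorflow as tf

SEQ_LEN = 30     # tactile frames captured while the gripper closes (placeholder)
N_TAXELS = 18    # pressure readings per frame across both fingertips (placeholder)
N_OUTPUTS = 5    # radius R plus entry (x, y) and exit (x, y)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, N_TAXELS)),
    tf.keras.layers.LSTM(64),           # summarize the grasp time series
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(N_OUTPUTS),   # regression: predicted cable pose
])

model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# x_train: (num_trials, SEQ_LEN, N_TAXELS) tactile sequences
# y_train: (num_trials, N_OUTPUTS) ground-truth poses from the overhead camera
# model.fit(x_train, y_train, validation_split=0.2, epochs=100, batch_size=32)
```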

Final Result
We successfully traced cables arranged in various shapes in five out of six validation trials. One trial is shown on the left.
My Contributions
As this was a group project, we shared the work. My primary contributions were the data collection code and jig, as well as the real-world controller. I also supported our machine learning work by testing various network architectures and parameters. The skills I applied throughout this project include:
Software: Python, ROS 1, TensorFlow, OpenCV, Docker
Robotics: Control of a UR5e arm, Dynamixel motors, URDFs, transforms, Contactile tactile sensor arrays
Mechanical: Design, CAD, 3D printing
For more information about this project, our final report is embedded below.
