Robotics

The work presented here is in partial fulfillment of the requirements for ME5286: Robotics Lab at the University of Minnesota, Twin Cities.

The code (to be published soon!) was developed for and run on a Universal Robots UR5 collaborative robot in the Robotics Lab.

Pick-and-place

A convolutional neural network (CNN) was built and trained on the provided dataset to classify four tools: pliers, hammer, screwdriver, and wrench.
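
A minimal sketch of such a classifier is shown below, assuming a Keras-style workflow; the architecture, image size, and dataset folder layout are illustrative assumptions, since the project code has not been published yet.

```python
# Minimal sketch of a 4-class tool classifier (assumed Keras workflow).
# The architecture, image size, and "tools/" folder layout are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4          # hammer, pliers, screwdriver, wrench
IMG_SIZE = (128, 128)

def build_model():
    model = models.Sequential([
        layers.Input(shape=IMG_SIZE + (3,)),
        layers.Rescaling(1.0 / 255),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Hypothetical layout: tools/{hammer,pliers,screwdriver,wrench}/*.jpg
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "tools", image_size=IMG_SIZE, batch_size=32)
    model = build_model()
    model.fit(train_ds, epochs=10)
    model.save("tool_classifier.keras")
```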

As shown in this video, the trained model is used by a custom Python script that activates the camera on the UR5 cobot, photographs the randomly placed tools, passes each image through the CNN to identify the tool, and then controls the UR5 and the Robotiq gripper via the RoboDK Python API to pick the tool up and place it in the matching bin.
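
The sketch below outlines this capture-classify-pick loop. It assumes a RoboDK station containing a UR5 item, pre-taught targets, and station programs that open and close the Robotiq gripper; the item names, the saved model file, and the captured frame path are all hypothetical.

```python
# Sketch of the pick-and-place loop, assuming a RoboDK station with a UR5,
# pre-taught targets ("home", "pick_<tool>", "bin_<tool>"), and RoboDK
# programs that drive the Robotiq gripper. Names and files are assumptions.
import numpy as np
import tensorflow as tf
from robodk.robolink import Robolink, ITEM_TYPE_ROBOT

RDK = Robolink()
robot = RDK.Item("UR5", ITEM_TYPE_ROBOT)
model = tf.keras.models.load_model("tool_classifier.keras")
# Alphabetical, matching image_dataset_from_directory's folder ordering.
CLASS_NAMES = ["hammer", "pliers", "screwdriver", "wrench"]

def classify(image):
    """Run one camera frame (H x W x 3, 0-255) through the CNN."""
    x = tf.image.resize(image, (128, 128))[tf.newaxis, ...]
    probs = model.predict(x, verbose=0)[0]
    return CLASS_NAMES[int(np.argmax(probs))]

def pick_and_place(tool_name):
    """Move to the tool's pick target, grasp, and drop it in its bin."""
    robot.MoveJ(RDK.Item("home"))
    robot.MoveJ(RDK.Item("pick_" + tool_name))
    RDK.Item("Gripper Close").RunProgram()   # assumed station program
    robot.MoveJ(RDK.Item("bin_" + tool_name))
    RDK.Item("Gripper Open").RunProgram()
    robot.MoveJ(RDK.Item("home"))

if __name__ == "__main__":
    # "frame.jpg" stands in for a frame grabbed from the cobot's camera.
    frame = tf.keras.utils.img_to_array(tf.keras.utils.load_img("frame.jpg"))
    pick_and_place(classify(frame))
```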

Demo: Pin-Hao Cheng, Andrew Alegria

Flashlight assembly

In this video, we demonstrate the use of a UR5 robot and a pneumatic chuck to assemble a flashlight. The robot was programmed using the RoboDK Python API.
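
A rough sketch of such an assembly sequence is below; the target names, the digital-output index used for the pneumatic chuck, and the step order are assumptions based on the video, not the actual program.

```python
# Sketch of a flashlight assembly sequence via the RoboDK Python API.
# Target names, the chuck's digital-output index, and the step order
# are assumptions; the real program has not been published.
from robodk.robolink import Robolink, ITEM_TYPE_ROBOT

RDK = Robolink()
robot = RDK.Item("UR5", ITEM_TYPE_ROBOT)
CHUCK_DO = 0  # assumed digital output wired to the pneumatic chuck

def move_through(target_names):
    """Joint-move through a list of pre-taught RoboDK targets."""
    for name in target_names:
        robot.MoveJ(RDK.Item(name))

if __name__ == "__main__":
    robot.setDO(CHUCK_DO, 1)                              # clamp the body
    move_through(["approach_body", "insert_batteries", "retract"])
    move_through(["pick_cap", "thread_cap", "retract"])
    robot.setDO(CHUCK_DO, 0)                              # release the part
    robot.MoveJ(RDK.Item("home"))
```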

Demo: Pin-Hao Cheng, Voice over: Andrew Alegria