CMPS 261 Data Visualization Winter 2019

Dongshuo Li, University of California Santa Cruz, Computer Engineering

Visualization of a Somatosensitive Actuator


About the Project:

Finger Creation:

The Machine Interaction Lab at the University of California, Santa Cruz has successfully developed molds for the Soft-Robotic-Gripper. The material poured into the molds is Dragon Skin 30. The whole process includes mixing the Dragon Skin, pouring it into the mold, vacuuming the mold, and baking the whole part in the oven at 60 degrees Celsius for 10 minutes.

Motion Tracking:

OpenCV with Python is a good choice for the motion-tracking part. Jupyter Notebook is a convenient platform to run the code: it runs in a browser without requiring a separate build environment, and all of the software and libraries used here are open-source. With both installed, a video of an actuator with reference points attached is recorded as the input. A Python script using OpenCV, run in Jupyter Notebook, detects each reference point and exports the pixel coordinates of its center to a CSV file. For more information, see the Motion-Tracking-code (OpenCV, Jupyter Notebook).
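As a rough sketch of how such a script could look (not the project's exact code), the example below assumes the reference points are saturated colored dots that an HSV threshold can isolate; the input file name, output file name, and threshold values are hypothetical and would need tuning for the actual markers.

    # A minimal marker-tracking sketch, assuming red-ish reference dots.
    # File names and HSV thresholds are placeholders, not the project's settings.
    import csv
    import cv2

    cap = cv2.VideoCapture("actuator.mp4")  # hypothetical input video
    with open("center_points.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "x", "y"])
        frame_idx = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            # Threshold to isolate the markers (tune for the marker color).
            mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
            # [-2] selects the contour list under both OpenCV 3 and 4.
            contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                        cv2.CHAIN_APPROX_SIMPLE)[-2]
            for c in contours:
                m = cv2.moments(c)
                if m["m00"] > 0:
                    # Centroid of the detected marker, in pixel coordinates.
                    writer.writerow([frame_idx,
                                     m["m10"] / m["m00"],
                                     m["m01"] / m["m00"]])
            frame_idx += 1
    cap.release()

In this sketch, each CSV row holds a frame index and one marker centroid; note that the markers within a frame come out in whatever order the contour detector finds them, which is why the visualization step has to sort them.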

Data Visualization

MATLAB is a powerful tool for sorting the data from the CSV file generated by OpenCV and visualizing it. The whole script has two major parts:

  • Correctly sort the data, because the code from the previous part does not guarantee a consistent point order in each frame
  • Plot the animated motion traces of the three reference points (a Python sketch of both steps follows this list)
  • The data visualization code is here and should be run in MATLAB: Data-Visualization
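The actual script is MATLAB; as an illustration of the two steps above, here is the same idea as a minimal Python sketch. It assumes the markers move only a little between frames and that all three points are detected in every frame; the CSV name matches the hypothetical one from the earlier sketch.

    # Sort points across frames by nearest-neighbor matching, then plot
    # each reference point's trace. Illustration only; the real script is MATLAB.
    import csv
    from collections import defaultdict
    import matplotlib.pyplot as plt

    frames = defaultdict(list)
    with open("center_points.csv") as f:  # hypothetical CSV from the tracker
        for row in csv.DictReader(f):
            frames[int(row["frame"])].append((float(row["x"]), float(row["y"])))

    traces = None
    for idx in sorted(frames):
        pts = frames[idx]
        if traces is None:
            # The first frame fixes the ordering: one trace per reference point.
            traces = [[p] for p in pts]
            continue
        remaining = list(pts)
        for trace in traces:
            # Greedy nearest-neighbor match against this trace's last point.
            last = trace[-1]
            best = min(remaining,
                       key=lambda p: (p[0] - last[0])**2 + (p[1] - last[1])**2)
            remaining.remove(best)
            trace.append(best)

    for i, trace in enumerate(traces):
        xs, ys = zip(*trace)
        plt.plot(xs, ys, label="point %d" % i)
    plt.gca().invert_yaxis()  # image coordinates: y grows downward
    plt.legend()
    plt.show()

Nearest-neighbor matching is enough here because consecutive video frames are close in time, so each marker's new position is closer to its own previous position than to the other markers'.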

Results of the Project

The videos below show the input video and the resulting motion traces.

User Manual

  • Download the video of the actuator movement with reference points attached
  • Download Jupyter Notebook and follow the instructions on its website to install it
  • Download OpenCV and follow the instructions on its website to install it
  • If the last two steps worked, you should be able to run a Python "Hello World" in Jupyter Notebook (see the environment check after this list); if not, search online until it works
  • Download the Motion-Tracking-code and run it in Jupyter Notebook, making sure the input path points to the video
  • If the last step worked, you should get a video file like the one in the motion-tracking part and a CSV file called center point; if not, check the video input path, and try not to change the rest of the code
  • Download the Data-Visualization script and run it in MATLAB, making sure the input CSV file is correctly linked
  • You can also download the paper here.
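As a quick sanity check for the installation steps above, a cell like the following (a hypothetical minimal check, not part of the project code) should run in Jupyter Notebook without errors once Python and OpenCV are installed:

    # Verify that Python runs and OpenCV imports in the notebook.
    import sys
    import cv2

    print("Hello World")
    print("Python:", sys.version.split()[0])
    print("OpenCV:", cv2.__version__)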

Acknowledgement

Great thanks to Max Belman and Jacqueline Clow at the University of California, Santa Cruz for their help compiling the motion-tracking code, and to Colin Martin for his help capturing video of the actuator.