Tracking Deformable Objects with Point Clouds

John Schulman, Alex Lee, Jonathan Ho, Pieter Abbeel

[PDF]



Welcome! This website supplements our ICRA 2013 submission, in which we present an algorithm for tracking deformable objects from a sequence of point clouds.

Quick sampler (4X speed).

Left pane: Asus RGB image. Right pane: rendering of the state estimate.
The towel tracking example is from our ground truth dataset, so the towel has markers on it; however, the markers are not used in the tracking.




Many more videos, which were collected together with our ground truth data, are available here.


Dataset

We used the ROS framework to run our experiments, and we recorded our data into bag files. Each task has several bag files associated with it:

[task name]_raw.bag: The original task execution.
[task name]_preprocessor.bag: The preprocessed task execution. The preprocessor filters out the green points and the points outside a manually defined polygon (see the sketch after this list).
[task name]_tracker_[experiment type].bag: The tracking results, containing the estimated state and the ground truth positions.
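
For reference, the preprocessing amounts to a color filter plus a polygon crop. The sketch below shows that kind of filtering in plain numpy; the green threshold and the point_in_polygon helper are illustrative stand-ins, not the exact logic our preprocessor uses.

    import numpy as np

    def point_in_polygon(pt, poly):
        # Standard ray-casting test: count edge crossings of a ray going
        # in the +x direction from pt; an odd count means "inside".
        x, y = pt
        inside = False
        n = len(poly)
        for i in range(n):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    def filter_cloud(points, colors, polygon):
        # points:  (N, 3) array of x, y, z in world coordinates
        # colors:  (N, 3) uint8 array of r, g, b values
        # polygon: list of (x, y) vertices of the crop polygon
        r = colors[:, 0].astype(int)
        g = colors[:, 1].astype(int)
        b = colors[:, 2].astype(int)
        # Illustrative "green" test: the green channel clearly dominates.
        is_green = (g > r + 20) & (g > b + 20)
        inside = np.array([point_in_polygon(p[:2], polygon) for p in points])
        keep = ~is_green & inside
        return points[keep], colors[keep]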

These bag files contain the topics described below:

[task name]_raw.bag
  /drop/kinect1/points, /drop/kinect2/points (sensor_msgs/PointCloud2): Original point clouds at 15 Hz.
  /phasespace_markers (bulletsim_msgs/OWLPhasespace): Ground truth positions in world coordinates.
  /tf (tf/tfMessage): Transforms between the cameras and world coordinates.

[task name]_preprocessor.bag
  /preprocessor/kinect1/points, /preprocessor/kinect2/points (sensor_msgs/PointCloud2): Filtered point clouds; green points and points outside the polygon have been removed.
  /preprocessor/kinect1/image, /preprocessor/kinect1/depth, /preprocessor/kinect2/image, /preprocessor/kinect2/depth (sensor_msgs/Image): RGBA and depth images from the cameras. The "A" channel of the RGBA image is a mask of the green points that were filtered out.
  /preprocessor/kinect1/polygon, /preprocessor/kinect2/polygon (geometry_msgs/PolygonStamped): The manually defined polygon used to crop the point clouds.
  /phasespace_markers (bulletsim_msgs/OWLPhasespace): Ground truth positions in world coordinates.
  /tf (tf/tfMessage): Transforms between the cameras and world coordinates.

[task name]_tracker_[experiment type].bag
  /tracker/object (bulletsim_msgs/TrackedObject): Positions of the nodes of the object being tracked.
  /phasespace_markers (bulletsim_msgs/OWLPhasespace): Ground truth positions in world coordinates.
  /tf (tf/tfMessage): Transforms between the cameras and world coordinates.
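
The bags can be read offline with the standard rosbag Python API. Here is a minimal sketch, assuming a ROS installation with rosbag and sensor_msgs on the path; the file name towel_preprocessor.bag is a hypothetical stand-in for one of the downloaded [task name]_preprocessor.bag files.

    import rosbag
    import sensor_msgs.point_cloud2 as pc2

    # Hypothetical file name; substitute any [task name]_preprocessor.bag.
    bag = rosbag.Bag('towel_preprocessor.bag')

    # Iterate over the filtered point clouds from the first camera.
    for topic, msg, t in bag.read_messages(topics=['/preprocessor/kinect1/points']):
        # Unpack the PointCloud2 into (x, y, z) tuples, skipping pixels
        # with no depth return.
        points = list(pc2.read_points(msg, field_names=('x', 'y', 'z'),
                                      skip_nans=True))
        print('%d points at t=%.3f' % (len(points), t.to_sec()))

    # The camera-to-world transforms are on /tf; print one message's worth.
    for topic, msg, t in bag.read_messages(topics=['/tf']):
        for transform in msg.transforms:
            print('%s -> %s' % (transform.header.frame_id,
                                transform.child_frame_id))
        break

    bag.close()

The /tracker/object and /phasespace_markers topics can be read the same way once the bulletsim_msgs message definitions are on your ROS package path.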

Download the cloth dataset (7.5 GB)

Download the rope dataset (12 GB)

Download the robot rope dataset (12 GB)

Source code

Source code for the tracking algorithm is available on GitHub.

The code is not yet polished for public use. If you are interested in using it, please email John and Alex describing what you plan to do with the code, and we'll try to help you out.