Visual Odometry

We hope to augment our navigation with visual odometry in time for the competition. The algorithm will run on the top stereo pair, which has a wide baseline. Here are some initial results.
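To give a rough idea of the approach, here is a minimal sketch of stereo visual odometry using OpenCV: features matched between consecutive left-camera frames get 3D positions from the previous frame's disparity, and PnP with RANSAC recovers the camera motion. The intrinsics `K`, the baseline, and the disparity convention are placeholder assumptions, not our actual calibration.

```python
import cv2
import numpy as np

# Hypothetical intrinsics and baseline -- replace with your own calibration.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
BASELINE_M = 0.5  # wide-baseline pair, metres (assumed)

orb = cv2.ORB_create(1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def estimate_motion(prev_left, prev_disparity, curr_left):
    """Estimate camera motion between two left-camera frames.

    3D points come from the previous frame's disparity (float pixels);
    motion is solved with PnP + RANSAC against the matched 2D points
    in the current frame.
    """
    kp1, des1 = orb.detectAndCompute(prev_left, None)
    kp2, des2 = orb.detectAndCompute(curr_left, None)
    matches = matcher.match(des1, des2)

    pts3d, pts2d = [], []
    for m in matches:
        u, v = kp1[m.queryIdx].pt
        d = prev_disparity[int(v), int(u)]
        if d <= 0:                        # no stereo depth at this pixel
            continue
        z = K[0, 0] * BASELINE_M / d      # depth from disparity: z = f*B/d
        x = (u - K[0, 2]) * z / K[0, 0]
        y = (v - K[1, 2]) * z / K[1, 1]
        pts3d.append([x, y, z])
        pts2d.append(kp2[m.trainIdx].pt)

    _, rvec, tvec, _ = cv2.solvePnPRansac(
        np.float32(pts3d), np.float32(pts2d), K, None)
    return rvec, tvec  # rotation (Rodrigues vector) and translation
```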

Where is the hook?

We use the lower pair of stereo cameras to identify objects and measure how far away they are. The stereo cameras work just like your eyes: each camera sees a slightly shifted image, and by calculating how much objects shift between the two views, we can tell how far away they are. From that information we generate a disparity map, which is just a color-coded image of how far things are from us. Can you spot the hook?

[Image: narrow_stereo_disp_hook — color-coded disparity map from the lower (narrow-baseline) stereo pair]
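For the curious, a disparity map like the one above can be produced with a few lines of OpenCV. This is a sketch rather than our exact pipeline: the image paths are placeholders, and the matcher parameters are assumptions you would tune for your own rig.

```python
import cv2

# Hypothetical frame paths -- substitute rectified images from your stereo pair.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; these parameters are assumptions to tune.
stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=96, blockSize=9)
disparity = stereo.compute(left, right).astype("float32") / 16.0  # SGBM output is fixed-point x16

# Normalize to 0-255 and color-code: near objects end up warm, far ones cool.
norm = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
colored = cv2.applyColorMap(norm, cv2.COLORMAP_JET)
cv2.imwrite("disparity_colored.png", colored)
```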

Platform Test

This is the video from our successful test of driving off the platform and finding the sample. Getting off the platform proved tricky at first, since the robot sees the ground in front of it as an obstacle!
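One way to keep flat ground from registering as an obstacle is to drop points near the expected ground height from the obstacle cloud before planning. The sketch below assumes a flat world and a hypothetical camera height; real terrain would call for a plane fit (e.g. RANSAC) instead of a fixed threshold.

```python
import numpy as np

def filter_ground(points_xyz, camera_height_m=0.8, tolerance_m=0.05):
    """Drop points at or below the assumed ground plane.

    Assumes a camera frame with +y pointing down, so a ground point sits
    roughly camera_height_m below the camera. camera_height_m and
    tolerance_m are placeholder values, not our measured geometry.
    """
    above_ground = points_xyz[:, 1] < (camera_height_m - tolerance_m)
    return points_xyz[above_ground]
```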

Autonomous Sample Pickup

We are testing autonomous sample pickup, and so far so good! The robot initializes, detects the sample, and picks it up, all without human intervention (except for a few breakpoints that let us check how things are working).
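That sequence behaves like a small state machine. Here is a sketch of the structure in Python; the `robot` methods (`home_arm`, `detect_sample`, `drive_to`, `grasp`) are hypothetical stand-ins for our actual interfaces, and the `debug` prompt plays the role of our breakpoints.

```python
from enum import Enum, auto

class PickupState(Enum):
    INITIALIZE = auto()
    DETECT_SAMPLE = auto()
    APPROACH = auto()
    GRASP = auto()
    DONE = auto()

def run_pickup(robot, debug=False):
    """Drive the pickup sequence; method names on `robot` are hypothetical."""
    state = PickupState.INITIALIZE
    target = None
    while state is not PickupState.DONE:
        if debug:  # stand-in for our breakpoints during testing
            input(f"[breakpoint] entering {state.name}; press Enter to continue ")
        if state is PickupState.INITIALIZE:
            robot.home_arm()
            state = PickupState.DETECT_SAMPLE
        elif state is PickupState.DETECT_SAMPLE:
            target = robot.detect_sample()  # e.g. from the lower stereo pair
            state = PickupState.APPROACH if target else PickupState.DETECT_SAMPLE
        elif state is PickupState.APPROACH:
            robot.drive_to(target)
            state = PickupState.GRASP
        elif state is PickupState.GRASP:
            robot.grasp(target)
            state = PickupState.DONE
```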