Another major component for AERO came in today. Thanks to our awesome sponsor KVH, we have a 1750 Fiber Optic Gyro (FOG) IMU. You may wonder what is so cool about a FOG IMU. It uses laser light traveling in opposite directions around a loop of fiber optic cable to measure the rate of rotation about a given axis. FOGs have excellent stability (their output does not vary much with temperature, altitude, etc.) and very low drift rates. The accelerations provided by the IMU come from three MEMS accelerometers, very similar to the ones in consumer GPS systems but significantly more accurate. With this sensor, we will be able to navigate accurately even without LIDAR or visual features to guide us. For more info on FOGs, visit Wikipedia.
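The counter-propagating beams pick up a tiny phase difference when the coil rotates (the Sagnac effect), and that phase difference is what the gyro actually measures. Here is a rough back-of-the-envelope sketch of the relationship; the fiber length, coil diameter, and wavelength below are illustrative guesses, not the specs of the KVH 1750.

```python
import math

# Sagnac phase shift between counter-propagating beams in a fiber coil:
#   delta_phi = 2 * pi * L * D * omega / (lambda * c)
# All parameter values here are illustrative assumptions, not KVH 1750 specs.
def sagnac_phase_shift(omega_rad_s, fiber_length_m, coil_diameter_m,
                       wavelength_m=1550e-9):
    c = 299_792_458.0  # speed of light, m/s
    return (2 * math.pi * fiber_length_m * coil_diameter_m * omega_rad_s) \
        / (wavelength_m * c)

# Example: 1 km of fiber on an 8 cm coil, rotating at Earth's rate (~15 deg/hr)
earth_rate = math.radians(15.0) / 3600.0  # rad/s
dphi = sagnac_phase_shift(earth_rate, fiber_length_m=1000.0, coil_diameter_m=0.08)
print(dphi)  # on the order of 1e-4 rad
```

The takeaway is how small the signal is: detecting Earth's rotation means resolving a phase shift of well under a milliradian, which is why FOG interferometers and their electronics are such impressive pieces of engineering.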
Today, we made our wood block, orange PVC pipe, and pre-cached samples. We will be able to test our vision algorithms with these samples. The red hockey puck and pink tennis ball are unaltered, but because of their distinctive shapes and colors, we should be able to accurately locate and identify them.
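The basic idea behind detecting the brightly colored samples can be sketched in a few lines: threshold the image for the target color and take the centroid of the matching pixels. This is only a minimal illustration of the approach, not our actual vision pipeline, and the thresholds are made-up values.

```python
import numpy as np

# Minimal sketch of color-based sample detection: threshold an RGB image
# for "red-ish" pixels and return the centroid of the mask as the sample
# location. Thresholds are illustrative assumptions.
def locate_red_sample(rgb):
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r > 150) & (r - g > 60) & (r - b > 60)  # crude red threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())  # (x, y) centroid in pixels

# Synthetic test image: a red "puck" on a gray background
img = np.full((100, 100, 3), 90, dtype=np.uint8)
img[40:60, 20:40] = (200, 30, 30)
print(locate_red_sample(img))  # → (29.5, 49.5)
```

A real detector would work in a more lighting-tolerant color space and add shape checks, but even this crude version shows why the distinctive colors make the puck and ball good test targets.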
A few days ago, we got a surprise in the lab. Our Nvidia Tesla K20 showed up about 10 days earlier than we expected. We mounted it in the computer and installed the Nvidia CUDA toolkit. Running the bandwidth test application from the toolkit showed average transfer rates of around 6 GB/s between the host processors and the Tesla.
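The number the bandwidth test reports is simply effective bandwidth: bytes moved divided by elapsed time. A quick sketch of the arithmetic (the transfer size and timing below are made-up values, not our measurement):

```python
# Effective bandwidth as reported by tools like CUDA's bandwidthTest:
# bytes transferred divided by elapsed time. Values are illustrative.
def effective_bandwidth_gb_s(bytes_transferred, seconds):
    return bytes_transferred / seconds / 1e9

# e.g. a 32 MiB host-to-device copy completing in 5.6 ms:
bw = effective_bandwidth_gb_s(32 * 1024**2, 5.6e-3)
print(round(bw, 2))  # → 5.99 GB/s
```

For context, ~6 GB/s is in the right ballpark for a PCI Express x16 link, so the card appears to be getting the bus bandwidth it should.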
After a quick pressure test with a bicycle pump and replacing a fitting that wouldn’t seal well, we filled the water cooling system. The system took around 700 ml of coolant, more than we expected, though much of that is probably in the radiator. After fighting with the BIOS on the server motherboard, we got Ubuntu 12.04 installed. The processors idle at around 37 °C, and running mprime on 30 threads brings them to ~50 °C, so the water cooling appears to be working well. These numbers should improve on the actual robot, since airflow through the radiator will be better.
The computer is almost ready! See more pictures below of the process of putting everything together.
The computer will temporarily be housed in the 4U server chassis, while we work on making a top plate for the Husky. Who knew that a 4U server chassis is significantly cheaper (and significantly bigger) than a full size ATX case?
This is our tripod-mounted test setup for our stereo cameras. We chose to use two sets of Allied Vision Manta G-095C cameras in a stereo arrangement. One set of stereo cameras will be fixed facing forward with a relatively wide field of view to help identify samples and obstacles in front of AERO. A second set of stereo cameras will be mounted on a high mast above the robot with a narrow field of view to scan a wide area for samples.
We selected the Manta G-095C for several reasons. We wanted GigE-capable cameras to make interfacing with the main computer simple and effective. The cameras support hardware-based triggering, so we can synchronize both cameras' shutters in the stereo arrangement. This reduces stereo matching errors caused by the robot moving between frames when we extract a depth field. Finally, the Sony EXview HAD II ICX692 sensor has a very wide dynamic range with good sensitivity. Perfect for outdoor applications!
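To see why synchronized shutters matter, recall how stereo recovers depth: for rectified cameras, depth is Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. Any unsynchronized robot motion between the two exposures shifts one image and corrupts d directly. A quick sketch with assumed (not our actual) calibration values:

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
# Focal length and baseline are assumed example values, not our calibration.
def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.12):
    return focal_px * baseline_m / disparity_px

z = depth_from_disparity(42.0)
print(z)  # → 4.0 (meters)

# A single pixel of disparity error from unsynchronized shutters at this
# range shifts the depth estimate by roughly 10 cm:
z_err = depth_from_disparity(41.0) - z
```

Since disparity errors grow into large depth errors at range, hardware triggering both shutters is much cheaper than trying to compensate for motion in software.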
Look forward to a post in a few days about the board we are building to synchronize the cameras through their I/O connectors.