March 2, 2005

Cornell Robotics Team Drives for the Gold

The Cornell DARPA Grand Challenge team will participate in a vehicle race across rough desert terrain in Fontana, Calif., early next fall. Each team’s vehicle will be given 10 hours to traverse approximately 170 miles of an as-yet-unknown race course strewn with bumpy terrain, rocks and other obstacles.

“Technically, by the rules, the team that completes [the race] the fastest will win the prize,” said Matt Grimm ’06, head of the team’s business sub-team. “Realistically speaking, whoever finishes will win. A percentage of vehicles will be out before they even start.”

The difficulty stems from the requirement that each vehicle be completely self-navigating; that is, each must carry its own means of interpreting the environment and deciding on the most favorable path to pursue.

The event is sponsored by the Defense Advanced Research Projects Agency. The Cornell team has registered two vehicles in the competition: a compact, dune buggy-style vehicle nicknamed “Titan” and a much larger vehicle nicknamed “Code Red.”

Titan was “custom made for military use by Singapore Technologies Kinetics … and came ready to run,” Grimm said. “Code Red,” however, has been almost entirely built by the team’s members.

On Code Red, “we had to wire up the engine, install the brakes [and] install motors to control the steering … and brak[ing],” said Noah Zych ’06, leader of the mechanical sub-team.

Both vehicles will possess a series of sensors and powerful computers that together will interpret the environment around the vehicle and pass that information to a decision-making computer responsible for steering.

A Light Detection and Ranging (LIDAR) sensor emits electromagnetic waves just outside the visible spectrum that bounce off objects and return to the sensor. The time delay between the emission of the beam and its return provides a precise measure of the distance to an object.
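As a rough sketch of that time-of-flight principle (not the team’s software; the example delay below is hypothetical), the distance falls directly out of the measured delay:

```python
# Illustrative sketch of the LIDAR time-of-flight calculation described above.
# Not the team's code; the example delay below is hypothetical.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_distance_m(round_trip_delay_s: float) -> float:
    """Distance to an object from the delay between emitting a pulse and
    detecting its return. The pulse travels out and back, so the one-way
    distance is half the round-trip distance."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_delay_s / 2.0

# A return delay of about 667 nanoseconds corresponds to roughly 100 meters.
print(f"{lidar_distance_m(667e-9):.1f} m")
```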

Because its wavelength is close to that of visible light, “it essentially sees what a human can see,” said Aaron Nathan ’06, leader of the sensor sub-team.

Each vehicle will also carry RADAR sensors, which emit longer-wavelength waves that bounce back from a different class of objects, usually solid obstacles. If the vehicles are “caught in a dust storm, the radar can see through the dust” to detect obstacles, whereas LIDAR cannot, Nathan said.

Another sensor, called stereo-vision, consists of two cameras arranged parallel to each other, much like human eyes. Together, the cameras produce images that can be compared to yield information similar to human depth perception.
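The comparison comes down to disparity: the same object appears shifted between the left and right images, and nearer objects shift more. A minimal sketch of that standard relationship, with hypothetical camera parameters (the article does not give the team’s actual setup):

```python
# Illustrative sketch of stereo depth-from-disparity, the idea behind the
# stereo-vision sensor described above. All parameters are hypothetical.

def stereo_depth_m(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth of a point from the horizontal shift (disparity) between where
    it appears in the left and right images. Closer objects shift more, so
    depth is inversely proportional to disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# Example: cameras 0.5 m apart, an 800-pixel focal length and a 20-pixel
# disparity put the point roughly 20 m away.
print(f"{stereo_depth_m(20, 800, 0.5):.1f} m")
```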

“Stereo is just as critical as the other sensors … it gives you the three-dimensional picture the radar would, but it is much cleaner,” Nathan said. Unlike the other sensors, the stereo-vision system is being developed entirely by members of the team.

Information collected by the sensors is interpreted by software on attached computers. “We run the software to filter [the data] and get rid of the noise,” said Brian Schimpf ’06, leader of the computer science sub-team. The filtered data then has to be combined into a model of the external environment, which can be more readily interpreted by the vehicle’s steering software.
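The article does not detail the filtering itself, but one simple illustration of the idea of discarding noise and combining detections into a single model of the environment is to accumulate sensor hits in a grid and treat a cell as an obstacle only once enough evidence agrees. The cell size, threshold and coordinates below are hypothetical:

```python
# Illustrative sketch only: one simple way to filter noisy detections and
# fuse them into a grid model of the environment, as described above.
# Cell size, threshold and coordinates are hypothetical.

from collections import defaultdict

class ObstacleGrid:
    """Accumulates detections from several sensors into a 2-D grid and
    reports a cell as an obstacle only after enough agreeing evidence,
    which filters out isolated noisy returns."""

    def __init__(self, cell_size_m: float = 1.0, threshold: int = 3):
        self.cell_size = cell_size_m
        self.threshold = threshold
        self.hits = defaultdict(int)  # (row, col) -> detection count

    def _cell(self, x_m: float, y_m: float) -> tuple:
        return (int(y_m // self.cell_size), int(x_m // self.cell_size))

    def add_detection(self, x_m: float, y_m: float) -> None:
        self.hits[self._cell(x_m, y_m)] += 1

    def is_obstacle(self, x_m: float, y_m: float) -> bool:
        return self.hits[self._cell(x_m, y_m)] >= self.threshold

# Three sensors report roughly the same point, so it counts as an obstacle;
# a single stray return elsewhere does not.
grid = ObstacleGrid()
for x, y in [(10.2, 5.2), (10.3, 5.1), (10.1, 5.3)]:
    grid.add_detection(x, y)
grid.add_detection(42.3, 17.8)  # lone noisy return
print(grid.is_obstacle(10.5, 5.5))   # True
print(grid.is_obstacle(42.3, 17.8))  # False
```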

“The combination of all those sensors is really challenging … nothing has really been done like it before,” Schimpf said.

The steering software, also referred to as the A.I., is still under development. It will determine the desired path of the vehicle via a path-finding algorithm and dispatch velocity and direction instructions accordingly.

“We are talking about a couple of ideas for the path-finding,” Schimpf said. “An algorithm called D* associates a cost with each cell in the world, and follows the path with the lowest cost.”

Unfortunately, D* occasionally produces paths that the vehicle cannot actually follow, and so team members are working to adapt the algorithm to generate more realistic paths. The final versions of the A.I. will also possess certain decision-making capabilities that will aid the vehicles in unusual circumstances. For instance, if a vehicle hits a rock, the A.I. should be able to determine this and put the vehicle in reverse, Schimpf said.
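D* itself adds efficient replanning as sensors reveal new obstacles mid-route, but the core idea Schimpf describes, a cost for each cell and a path with the lowest total cost, can be illustrated with a plain Dijkstra search over a small hypothetical grid (this is a sketch of the principle, not the team’s algorithm):

```python
# Simplified illustration of the cost-grid path-finding idea described above.
# This is ordinary Dijkstra search over cell costs, not D* itself (which adds
# incremental replanning when new obstacles are discovered). Grid values are
# hypothetical: low numbers are easy terrain, high numbers are obstacles.

import heapq

def lowest_cost_path(costs, start, goal):
    """Returns the sequence of cells whose summed cost from start to goal
    is minimal, moving between 4-connected neighbors."""
    rows, cols = len(costs), len(costs[0])
    frontier = [(0, start, [start])]  # (total cost so far, cell, path)
    visited = set()
    while frontier:
        total, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in visited:
                heapq.heappush(frontier, (total + costs[nr][nc], (nr, nc), path + [(nr, nc)]))
    return None

# A small world: 1 = smooth ground, 9 = rocks the vehicle should avoid.
world = [
    [1, 1, 9, 1],
    [1, 9, 9, 1],
    [1, 1, 1, 1],
]
# The cheapest route detours around the rocks rather than driving over them.
print(lowest_cost_path(world, (0, 0), (0, 3)))
```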

Archived article by David Andrade
Sun Staff Writer