Nvidia Looks to Speed Up Virtual Self-Driving Testing

The autonomous driving technology that Nvidia introduced at its GPU Technology Conference on Tuesday will support the hidden side of driverless car development: billions of miles of testing that occurs in simulation, not on actual roads.

Last week’s fatal accident in Arizona involving a self-driving Uber SUV drew attention to autonomous cars that are out on public roads for safety testing. But developers are also tackling safety testing in data centers, running autonomous systems through simulated driving situations over and over rather than waiting for those situations to occur in the real world.

These virtual trials may play a growing role in the development of self-driving cars as governments look more closely at potential dangers of real-world trials. Arizona Gov. Doug Ducey, who has made his state one of the most welcoming for self-driving cars, said Monday the state would seek to suspend Uber’s ability to test them. (Uber has halted all its US tests pending the outcome of the accident investigation.)

On March 27 at GTC in San Jose, Nvidia announced Nvidia Drive Constellation, which it called the best system yet for rapidly simulating billions of miles of tests under the worst possible conditions. It will be available to early-access customers in the third quarter of this year.

Constellation consists of two servers: One runs Nvidia Drive Sim software to generate the kinds of data that a self-driving car’s sensors would collect, and the other incorporates an Nvidia Drive Pegasus hardware-software platform like those designed to go into vehicles. Pegasus is the company’s top-of-the-line in-car platform for handling so-called Level 5, fully driverless operation. It includes two of Nvidia’s new Xavier automotive chips, along with GPUs and AI software.

“The Drive Pegasus that’s in the cloud… doesn’t know it’s in the cloud,” said Danny Shapiro, Nvidia’s senior director of automotive, in a briefing with reporters on Tuesday. “We’re testing in a virtual environment, but we’re testing in the actual hardware and software that would be in the car.”

Nvidia Drive Sim, running on GPUs, produces photorealistic simulated sensor data 30 times per second, he said. The Pegasus uses that data to make real-time decisions about things like turning, accelerating and braking. Its commands are fed back into the simulator to generate instant feedback about whether Pegasus is operating the virtual vehicle properly.
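The closed loop Shapiro describes — simulator emits sensor frames, the driving stack emits commands, the commands update the virtual world — can be sketched in miniature. This is not Nvidia’s software: the function names, the toy physics, and the single obstacle-distance “sensor” are all hypothetical stand-ins used only to show the shape of a hardware-in-the-loop cycle running at 30 ticks per second.

```python
TICK_HZ = 30  # the simulator publishes sensor frames 30 times per second

def simulate_sensors(state):
    """Hypothetical stand-in for the simulation server: derives a
    sensor frame from the current virtual-world state."""
    return {"speed": state["speed"], "obstacle_m": state["obstacle_m"]}

def driving_stack(frame):
    """Hypothetical stand-in for the in-car platform: maps a sensor
    frame to actuator commands (brake hard if an obstacle is close)."""
    if frame["obstacle_m"] < 20.0:
        return {"throttle": 0.0, "brake": 1.0}
    return {"throttle": 0.3, "brake": 0.0}

def step(state, command, dt=1.0 / TICK_HZ):
    """Feed the commands back: advance the virtual world by one tick
    using toy longitudinal dynamics."""
    accel = 2.0 * command["throttle"] - 6.0 * command["brake"]
    state["speed"] = max(0.0, state["speed"] + accel * dt)
    state["obstacle_m"] -= state["speed"] * dt
    return state

# Start at 15 m/s with an obstacle 60 m ahead, run 10 simulated seconds.
state = {"speed": 15.0, "obstacle_m": 60.0}
for _ in range(TICK_HZ * 10):
    command = driving_stack(simulate_sensors(state))
    state = step(state, command)

print(state["speed"])  # 0.0 — the stack braked the car to a stop
```

Because every tick closes the loop, a developer can immediately see whether the decision-making side handled the scenario — and rerun the same edge case thousands of times with varied parameters.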

Driving simulations let developers test more iterations of their technology more quickly, leading to rapid improvement through machine learning, Shapiro said. “Edge cases” that are rare in the real world — such as a pedestrian crossing a road at night without a crosswalk, as happened in the Arizona incident — can be repeated many times in a simulator.

For example, the Nvidia Drive Sim server can simulate the blinding light of a sunset, a challenging condition for self-driving cars, 24 hours a day, Shapiro said.

“We can create potentially hazardous scenarios without putting anyone in harm’s way,” Shapiro said.

Constellation draws upon Nvidia’s expertise in generating photorealistic objects and environments. The company was a pioneer in GPUs, which saw their first major use in computer games and have constantly evolved to make those games more realistic.

Nvidia’s GPUs have powered in-car infotainment systems for years and are poised to play a big role in future connected and autonomous cars. The company says about 370 companies are using its Nvidia Drive hardware-software platform. Its partners include Toyota, Mercedes-Benz, Volkswagen, Bosch and more than 200 startups, such as Aurora, the closely watched company led by former top engineers from Google, Tesla and Uber.

— Stephen Lawson is a freelance writer based in San Francisco. Follow him on Twitter @sdlawsonmedia.
