Technology companies are creating autonomous driving simulation software and hardware tools to help make self-driving cars safer. Representatives from NVIDIA, Cognata, Foretellix, LG, understand.ai and Ford’s Quantum Signal AI explain how their products offer multiple scenarios for the future of autonomous driving.
Autonomous driving has two parts. First comes training, which teaches the car to drive better than a human; then the car uses what it has learned to make real-time decisions, says Danny Shapiro, Senior Director of Automotive at NVIDIA, maker of the Drive Constellation simulator and of computer systems built into many self-driving vehicles.
“How do we know it is really safe? Simulation is the key element to make sure what we are putting on the road is really safe,” says Shapiro. He notes that NVIDIA has over fifty autonomous cars in its fleet around the world, each taking in data, with thousands of people curating that data and labelling lanes, signs and pedestrians to train the autonomous driving systems.
“A lot of autonomous miles driven are not meaningful miles – driving on the highway is pretty boring. We have to define the interesting scenarios,” says Shapiro, who tells us that training is ongoing, and that the automotive industry is building up massive infrastructure and data centres to support it.
“More and more we are realising there are scenarios that have not been tested. We built a simulator with cameras, radar and LiDARs and took it out of the car but can still feed it signals, making a tool for automakers to test autonomous driving,” says Shapiro.
“We can put our sensors on virtual vehicles, create scenarios with objects that are moving and see what is happening.”
“We can do all kinds of things we can’t do in the real world. We can make all kinds of parameters through the simulator, to try things over and over again to make sure it all works. It is a key part of validation that the car that will be safe on the road actually works,” says Shapiro. Testing on the road is valuable, but it is expensive and hard to scale, and coverage suffers: you don’t necessarily get to test where you want to test. Validating the software requires high-fidelity data and the ability to test dangerous and rare situations, says Shapiro.
NVIDIA’s simulator is very sophisticated. Users can run tests simultaneously, and the NVIDIA Constellation system lets automakers ingest data from sensors already on the market into the simulator; it can even use traffic cameras on freeways. NVIDIA is working with LiDAR companies to integrate massive amounts of data, then generate synthetic data and test how it all works in the loop.
“We are working within the industry to come up with what is essentially an AV driver’s license with a very high bar that takes into account everything we can think of— to make sure AVs pass these tests before going on the road,” says Shapiro.
Simulation scenarios can include sunrise or sunset when sensors can get blinded, all kinds of weather and driving scenarios such as cars cutting over, running red lights, fast acceleration or surprise braking.
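The combinatorial pressure behind these scenarios can be sketched in a few lines. This is not NVIDIA’s actual tooling; it is a hypothetical Python enumeration showing how just three parameter axes already multiply into dozens of distinct test cases:

```python
import itertools

# Hypothetical scenario parameter axes (not NVIDIA's real configuration),
# drawn from the conditions mentioned above: lighting, weather, and the
# behavior of another vehicle.
lighting = ["dawn", "noon", "dusk", "night"]
weather = ["clear", "rain", "fog", "snow"]
other_car = ["cut_in", "run_red_light", "hard_accel", "surprise_brake"]

# Every combination becomes one simulated test case.
scenarios = [
    {"lighting": l, "weather": w, "other_car": b}
    for l, w, b in itertools.product(lighting, weather, other_car)
]

print(len(scenarios))  # 4 * 4 * 4 = 64 variants from only three axes
```

Each additional axis (road type, pedestrian behavior, sensor degradation) multiplies the count again, which is why scenario coverage is treated as an industry-scale problem rather than a per-company one.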
NVIDIA is working with over 400 companies on autonomous driving.
Shapiro wonders: “How do we create all the scenarios? It’s something the industry is coming together to solve.”
Cognata helps automakers and autonomous vehicle companies move their vehicle testing onto the Cognata simulation platform, shifting most testing from the road into simulation.
“We do all kinds of data testing simulations and other scenarios. We do it at a very large scale in the cloud, making it a lot cheaper and faster. We want to cover the head and the long tail – when something unexpected happens, such as a bird landing on the car. But we need to define how far we need to go. It’s a never-ending story,” says Shay Rootman, Director of Partnerships and Development at Cognata, which offers an end-to-end simulation platform that can work on its own or alongside other platforms.
Cognata is cloud-based and enables users to cover many scenarios. Customers usually start with a use case in mind, such as identifying pedestrians better; they set up the scenario using the Cognata engine and then run it at very large scale. “We support all phases of the simulation from training to analysis. It is a turnkey solution. The strength of Cognata is its photorealism aspect and AI platform for machine learning using our datasets at a higher resolution. Currently, users can simulate cameras and we are partnering with Innoviz for working with LiDAR,” says Rootman.
How do you know an autonomous driving system is safe? By testing in simulation, testing on tracks or driving on roads? Did you cover it all, including edge cases? Foretellix uses M-SDL, the Measurable Scenario Description Language, to measure autonomous driving test coverage and analyze the data.
“Everyone is testing on different platforms. You don’t know what you are covering. You want to know how far you are in the process. You need something more measurable. Our mission is to move to measurable safety – to go from quantity of miles to quality of coverage,” says Ron Grosberg, research and development at Foretellix, whose co-founders come from the chip industry, where they created coverage verification technology.
“You need to be able to know what you have covered, where you are and what to do next. So, we developed a software platform,” says Grosberg of Foretellix’s Foretify, which has three parts.
First is the language. Its power is the ability to specify what you want to measure by describing a scenario – another car’s speed, its location, the weather – with the ability to create thousands of variants. The second part is generating the scenarios, and the third is aggregation and analysis.
Foretellix offers the productivity to handle many scenarios, the ability to measure safety and the portability to work with different tools. Foretellix is platform-agnostic and will work with any simulation platform that offers an API.
Some companies work with three-to-five simulators. Foretellix offers a dashboard to show results from multiple platforms and many scenarios.
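The “quality of coverage” idea behind that dashboard can be pictured as a coverage matrix: a scenario bucket counts as covered once any simulator run hits it. A minimal sketch follows, with invented buckets and run logs; Foretellix’s actual Foretify platform and M-SDL are far richer than this:

```python
# Hypothetical coverage aggregation across multiple simulation platforms.
# Buckets are (maneuver, condition) pairs; real M-SDL scenarios carry many
# more parameters. All names here are illustrative, not Foretellix's.
required = {("cut_in", "rain"), ("cut_in", "fog"),
            ("run_red_light", "clear"), ("surprise_brake", "night")}

# Results reported by different simulators: which buckets each run hit.
runs = {
    "simulator_a": {("cut_in", "rain"), ("run_red_light", "clear")},
    "simulator_b": {("cut_in", "rain"), ("surprise_brake", "night")},
}

# Union the hits from every platform, then compare against the goal.
covered = set().union(*runs.values()) & required
missing = required - covered
print(f"coverage: {len(covered)}/{len(required)}")  # coverage: 3/4
print(sorted(missing))  # [('cut_in', 'fog')]
```

The point is that coverage is measured against a defined goal set, not against miles driven – two platforms hitting the same bucket adds nothing, while one uncovered bucket tells you exactly what to test next.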
While developing software for infotainment systems, engineers at LG realized that the simulation software they were building would also be great for autonomous driving simulation, says Steve Lemke, Principal Engineer at LG Electronics’ Advanced Platform Lab.
The LGSVL Simulator for autonomous vehicle development is free, open-source software based on the Unity game development platform that can be scripted in Python. Lemke says it can easily run on a gaming laptop.
“Simulation needs to be photorealistic,” says Lemke, about LGSVL that can simulate LiDAR, cameras, GPS and radar tests.
Lemke also notes that commercial simulation software can cost as much as $100,000. LG made the software open source to enable academics as well as auto companies to simulate autonomous driving, though there may be paid versions in the future, or LG may charge for custom simulations. LGSVL is compatible with the open-source Apollo platform, Autoware and ROS/ROS2. The video looks realistic thanks to the Unity High Definition Render Pipeline (HDRP), and HD maps can be imported. LGSVL is also compatible with many automotive computing suppliers, such as Renesas, Qualcomm and Intel, notes Lemke.
Why Do You Need AI in AV Simulation to Hunt Black Swans?
“Autonomous driving simulation is not enough unless you have the AI to go with it,” says Philip Kessler, Co-founder and CTO of understand.ai, now part of dSPACE.
“understand.ai is an autonomous simulation platform, but we also do data logging on devices that record data from five cameras and four LiDARs, then save it all to a hard disk to collect the data and transform it into simulations,” says Kessler. “We have an editor to create environments by hand, or they can be based on real-world video – an actual roundabout, trees – or on video supplied by our customers.”
Black Swan scenarios are unusual, out-of-the-blue scenarios that can be unthinkable and most likely don’t happen often in real life. How do you simulate Black Swans?
“You find the Black Swan by collecting a lot of data, as our tier-one customers do with petabytes of data. We help them find the Black Swan anomaly through data selection, then transform it into the virtual world, where you can add a few thousand variations,” says Kessler.
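The “find the anomaly in petabytes of logs” step can be sketched as simple outlier detection over a per-frame signal. The z-score stand-in below is far cruder than understand.ai’s actual data selection, and the data is invented, but it shows the shape of the idea: flag the rare frame, then promote it into simulation:

```python
import statistics

# Hypothetical per-frame object counts from a logged drive. The spike is
# the kind of rare event (say, a bird landing on the car) worth turning
# into a simulated scenario with thousands of variations.
object_counts = [3, 4, 3, 5, 4, 3, 4, 21, 4, 3, 5, 4]

mean = statistics.mean(object_counts)
stdev = statistics.stdev(object_counts)

# Flag frames more than 3 standard deviations from the mean.
anomalies = [i for i, n in enumerate(object_counts)
             if abs(n - mean) > 3 * stdev]
print(anomalies)  # [7]
```

In practice the signal would be something richer than an object count, and the selection would run across fleets of logs, but the pipeline is the same: mine for the outlier, reconstruct it virtually, then vary it.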
dSPACE and understand.ai offer software and hardware simulation that can simulate the car without building the car, says Kessler.
Simulation is not only good for the driving aspects but also for improving the customer experience.
“Simulation plays a critical role in ensuring that systems work as intended, but also in providing a great customer experience,” says Dr Mitch Rohde, CEO of Quantum Signal AI, now part of the Ford Motor Company.
“We use simulation to analyze and fine-tune the ride quality, user interactions, and other systems so customers have the great experience that they expect from a Ford. Smooth motion control and intuitive interfaces align machine performance with human expectation, building trust and acceptance.”