INTERVIEW


NVIDIA on tomorrow’s car - perpetually upgradeable machines

With vehicles averaging 11 years on the road, the opportunities to surprise and delight motorists are few and far between. AI, software and high-performance computing are changing how the auto industry operates, from how vehicles are made to how we experience them. NVIDIA is among those pushing back the technical boundaries of AI, and it is partnering with Mercedes-Benz to help make the automaker's fleet perpetually upgradeable from 2024. To learn more, Matthew Beecham spoke to Danny Shapiro, senior director of automotive at NVIDIA.

Achieving the performance standards necessary for SAE Levels 3–5 driver autonomy at lower cost requires a fresh approach. What's yours?

Autonomous vehicles are transforming the way we live, work, and play - creating safer and more efficient roads. These revolutionary benefits require massive computational horsepower and large-scale production software expertise. Tapping into decades of experience in high-performance computing, imaging, and AI, NVIDIA has built a software-defined, end-to-end platform for the transportation industry that enables continuous improvement and continuous deployment through over-the-air updates. It delivers everything needed to develop autonomous vehicles at scale, creating vehicle fleets that get better and better over time.

We understand that NVIDIA is partnering with Mercedes-Benz to develop AV software. Could you tell us a little more about that and your aims for the partnership?

Starting in 2024, every next-generation Mercedes-Benz vehicle will include a first-of-its-kind software-defined computing architecture built on the NVIDIA Drive platform. This includes the most powerful on-board computer, system software and applications for consumers, marking the turning point at which traditional vehicles become high-performance, updateable computing devices.


These revolutionary vehicles are enabled by NVIDIA Drive AGX Orin, with multiple processing engines for high-performance, energy-efficient compute and AI, and equipped with surround sensors. Primary features will be the ability to drive regular routes from address to address autonomously and Level 4 automated parking, in addition to countless safety and convenience applications. These capabilities will continue to improve, and new features will be released, as AI technology advances.

We are hearing that while manufacturers remain excited about automated driving, the challenge will require more time and effort to fully realize. What's your view on the path towards autonomous driving?

Ultimately all vehicles will become automated or autonomous. As a solutions provider for the entire transportation industry, we are delivering the end-to-end tools necessary to deploy this technology safely and efficiently. Autonomous driving pilots are underway all over the world, leveraging NVIDIA technology to operate on public roads - robotaxis, delivery bots, long-haul trucking and platooning. However, the vehicle manufacturers and regulators will ultimately determine when these fleets can begin operating at scale.

Do you expect to see a bigger role for simulation as part of the training, validation and testing of AVs?

Simulation is a critical aspect of autonomous vehicle development. It enables testing and validation in diverse, rare and hazardous scenarios without having to search out these situations in the real world. With NVIDIA Drive Constellation and Drive Sim software, autonomous vehicle developers can perform these tests in the virtual world at a massive scale.


NVIDIA Drive Constellation is a cloud-based simulation platform, designed from the ground up to support the development and validation of autonomous vehicles. The data center-based platform consists of two side-by-side servers.


The first server uses NVIDIA RTX GPUs running Drive Sim software and generates the sensor output from the virtual car driving in a virtual world. The second server contains the actual vehicle computer, which processes the simulated sensor data while running exactly the same Drive AV and Drive IX software that is deployed in the real car.


The driving decisions from the second server are fed back into the first, enabling real-time, bit-accurate, hardware-in-the-loop development and testing.
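
To make the closed loop concrete, here is a minimal sketch, in Python, of the hardware-in-the-loop cycle described above. The class names, message formats and function names are hypothetical illustrations for this article, not NVIDIA's actual Drive Constellation APIs.

    # Minimal hardware-in-the-loop sketch (hypothetical names, not NVIDIA's APIs).
    # One object stands in for the RTX server rendering the virtual world and its
    # sensors; the other stands in for the in-car computer running the driving stack.
    from dataclasses import dataclass

    @dataclass
    class SensorFrame:
        camera: bytes       # simulated camera image
        lidar: bytes        # simulated lidar point cloud
        timestamp_us: int

    @dataclass
    class DrivingCommand:
        steering_rad: float
        throttle: float
        brake: float

    class SimulationServer:
        """Stands in for the RTX GPU server running the virtual world."""
        def render_sensors(self, t_us: int) -> SensorFrame:
            # In the real platform this is photoreal camera, lidar and radar output.
            return SensorFrame(camera=b"", lidar=b"", timestamp_us=t_us)

        def apply_command(self, cmd: DrivingCommand) -> None:
            # Advance the virtual vehicle according to the stack's decision.
            pass

    class VehicleComputer:
        """Stands in for the second server holding the actual in-car computer."""
        def compute_driving_commands(self, frame: SensorFrame) -> DrivingCommand:
            # The real system runs the same software that ships in the car.
            return DrivingCommand(steering_rad=0.0, throttle=0.1, brake=0.0)

    def run_hil_loop(steps: int, dt_us: int = 33_000) -> None:
        sim, ecu = SimulationServer(), VehicleComputer()
        t = 0
        for _ in range(steps):
            frame = sim.render_sensors(t)              # server 1 -> server 2
            cmd = ecu.compute_driving_commands(frame)  # driving decision
            sim.apply_command(cmd)                     # server 2 -> server 1, closing the loop
            t += dt_us

    run_hil_loop(steps=10)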

How can consumers take advantage of AI capabilities without having to switch to a new vehicle?

Like all modern computing devices, software-defined vehicles are supported by a large team of AI and software engineers, dedicated to improving the performance and capability of the car as technology advances.


With a software-defined architecture like that in the upcoming Mercedes-Benz fleet, automakers can add capabilities and services over the air at any time, not just for the first owner but for every owner throughout the life of the car.
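
As a rough illustration of what such over-the-air delivery can look like on the vehicle side, the Python sketch below checks a manifest for a newer software bundle and verifies its integrity before staging it. The endpoint, manifest format and function names are hypothetical, not a disclosed Mercedes-Benz or NVIDIA design.

    # Illustrative vehicle-side over-the-air (OTA) update check (hypothetical design).
    import hashlib

    INSTALLED_VERSION = "2024.1.0"

    def fetch_manifest() -> dict:
        # A production client would query the automaker's update service over TLS
        # and authenticate the response; here a canned manifest is returned.
        bundle = b"new feature bundle"  # placeholder payload for the example
        return {
            "version": "2024.2.0",
            "sha256": hashlib.sha256(bundle).hexdigest(),
            "url": "https://updates.example.com/bundle-2024.2.0.bin",
        }

    def is_newer(candidate: str, installed: str) -> bool:
        # Compare dotted version strings numerically, e.g. 2024.2.0 > 2024.1.0.
        return tuple(map(int, candidate.split("."))) > tuple(map(int, installed.split(".")))

    def verify_bundle(bundle: bytes, expected_sha256: str) -> bool:
        # Check integrity before staging; a real system would also verify a
        # cryptographic signature and install to an inactive partition.
        return hashlib.sha256(bundle).hexdigest() == expected_sha256

    manifest = fetch_manifest()
    if is_newer(manifest["version"], INSTALLED_VERSION):
        downloaded = b"new feature bundle"  # placeholder for the fetched bundle
        if verify_bundle(downloaded, manifest["sha256"]):
            print(f"Staged update {manifest['version']} for installation at the next restart.")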

What does the trend for shared mobility mean for NVIDIA in terms of its software products?

The NVIDIA Drive Software stack is open, empowering developers to efficiently build and deploy a variety of state-of-the-art AV applications, including perception, localization and mapping, planning and control, driver monitoring, and natural language processing. Anyone building autonomous vehicles, including automakers, tier-1 suppliers, robotaxi companies and mobility startups, can leverage parts of the stack or all of it to develop their products.
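
As a simplified picture of how a developer might take parts of such a modular stack or all of it, the Python sketch below composes a pipeline from interchangeable stages. The stage names and interfaces are hypothetical illustrations, not the NVIDIA Drive SDK.

    # Illustrative composition of a modular AV pipeline (hypothetical interfaces).
    from typing import Callable, Dict, List

    Frame = Dict[str, object]  # stand-in for sensor data plus intermediate results

    def perception(frame: Frame) -> Frame:
        frame["objects"] = []            # detected vehicles, pedestrians, lanes, ...
        return frame

    def localization_and_mapping(frame: Frame) -> Frame:
        frame["pose"] = (0.0, 0.0, 0.0)  # estimated position on the map
        return frame

    def planning_and_control(frame: Frame) -> Frame:
        frame["trajectory"] = []         # planned path given objects and pose
        return frame

    def build_pipeline(stages: List[Callable[[Frame], Frame]]) -> Callable[[Frame], Frame]:
        # A robotaxi developer might run every stage; a supplier might take only one.
        def run(frame: Frame) -> Frame:
            for stage in stages:
                frame = stage(frame)
            return frame
        return run

    full_stack = build_pipeline([perception, localization_and_mapping, planning_and_control])
    print(sorted(full_stack({}).keys()))  # ['objects', 'pose', 'trajectory']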

Presumably, using eye, voice and hand-gesture controls, it is possible to eliminate buttons from an infotainment system. What is your vision of this touch-free user experience?

AI is enabling a range of new software-defined, in-vehicle capabilities across the transportation industry. With centralized, high-performance compute, automakers can now build vehicles that become smarter over time.


Using natural language processing, drivers can control vehicle settings without taking their eyes off the road. Conversational AI enables easy access to search queries, like finding the best coffee shops or sushi restaurants along a given route. The same system that monitors driver attention for advanced driver assistance systems can also interpret gesture controls, providing another way for drivers to communicate with the cockpit without having to divert their gaze. The new Mercedes-Benz S-Class powered by NVIDIA is a first step on this AI journey.
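
As a simplified sketch of how spoken requests and gestures can be routed to the same set of cockpit actions, the Python example below dispatches normalized intents to handlers. The intent names and handlers are hypothetical illustrations, not the production Drive IX software.

    # Simplified multimodal cockpit dispatcher (hypothetical intents and handlers).
    from typing import Callable, Dict

    def set_cabin_temperature(value: str) -> str:
        return f"Cabin temperature set to {value}."

    def find_places(value: str) -> str:
        return f"Searching for {value} along the current route."

    def answer_call(_: str) -> str:
        return "Answering incoming call."

    # Voice and gesture recognizers both emit the same normalized intents, so one
    # dispatch table serves the driver without requiring eyes off the road.
    INTENT_HANDLERS: Dict[str, Callable[[str], str]] = {
        "set_temperature": set_cabin_temperature,
        "find_places": find_places,
        "answer_call": answer_call,
    }

    def dispatch(intent: str, value: str = "") -> str:
        handler = INTENT_HANDLERS.get(intent)
        return handler(value) if handler else "Sorry, I didn't catch that."

    print(dispatch("find_places", "sushi restaurants"))  # from a spoken request
    print(dispatch("answer_call"))                       # from a hand gesture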

We are hearing how augmented reality HUDs are getting closer to the market. How can NVIDIA help?

NVIDIA GPUs power crystal-clear graphics that can quickly and easily show drivers their route, as well as what the sensors on the car see. Augmented reality heads-up displays and virtual reality views of the vehicle's surroundings deliver the most important data (such as parking assistance, directions, speed and oncoming obstacles) without disrupting the driver's line of sight. The new S-Class also features an AR head-up display as part of the digital cockpit experience.

Is there such a thing as a cyber secure car?

Safety and security are our number one priority, and we have invested hundreds of millions of dollars to secure our end-to-end platform and develop safe products. An autonomous vehicle platform can't be considered safe without cybersecurity, built on solid cybersecurity engineering practices and development. To deliver a best-in-class automotive security platform with high consumer confidence, we've built a world-class security team, aligned our work with government and international standards, and comply with all regulations.
