NVIDIA Hits the Road with Autonomous Vehicle Hardware at GTC

November 16, 2021 by Tyler Charboneau

At NVIDIA's GPU Technology Conference, the company unveiled its roadmap to bring autonomy to more vehicles—built on chipsets, sensors, GPUs, cameras, and more.

The sun has officially set on NVIDIA’s 2021 GPU Technology Conference (GTC), and this year’s sessions again highlighted an intriguing lineup of emerging technologies. Among other industries, automotive was a shining star for NVIDIA. The company unveiled numerous platforms supporting autonomous driving and continuous AI learning.

What’s in the pipeline, and who stands to benefit? Let’s break down each development one by one. 


NVIDIA’s Autonomous Vision

NVIDIA CEO Jensen Huang laid out his company's plan for the future—including accelerated computing, data center architecture, robotics, AI, and more. The automotive segment is one of many, though the company has forged numerous partnerships with global automakers. Household names like Mercedes-Benz, Volvo, Hyundai, Lotus, Sony, and others have already signed on to leverage its upcoming platforms. A number of EV and self-driving OEMs are also in the mix. 


NVIDIA's autonomous computing platform. 

NVIDIA's goal is to create a full-fledged self-driving technology stack. This starts with dedicated hardware and ends with AI-powered software. These core components work in unison to enable autonomous functionality, spanning chipsets, sensors, GPUs, cameras, and more. Interestingly, the platform is designed for open compatibility with systems from Luminar, HELLA, Valeo, and Sony. 

NVIDIA hopes manufacturers can take its open-stack technologies and apply them to different products. According to Huang, the vast majority of EVs will have substantial autonomous vehicle (AV) capabilities by 2024. 

How is this possible? First, NVIDIA has introduced a number of software components to aid AV development teams:

  • 150 new acceleration libraries, many of which are dedicated to AI (critical for AV training and on-road operation)
  • 65 new and updated SDKs to further support software development on NVIDIA's platform
  • The Omniverse Replicator for DRIVE Sim—made specifically for training deep neural networks through synthetic data generation

Tier-one suppliers, sensor makers, software startups, robotaxi companies, and more will rely on these resources and NVIDIA’s compute capabilities. What components are now available?


DRIVE Hyperion 8

NVIDIA says its Hyperion 8 is a computer architecture and sensor set for dedicated self-driving systems. This sensor suite packages 12 cameras, 12 ultrasonics, nine radars, and a forward-facing LiDAR sensor. 

This LiDAR unit is Luminar's Iris sensor. Iris is automotive grade and particularly attractive for its low cost and its ability to scale from 10,000 vehicles to 1,000,000 or more. It can detect roads and drivable space out to 80 meters, lane markings out to 150 meters, and vehicles and other objects out to 250 meters. 

HELLA provides short-range radar for Hyperion, while Continental brings long-range capabilities into the fold. Sony and Valeo have integrated their cameras into this autonomous package, while the latter's distance-measuring ultrasonic sensors spot objects in the vehicle's path. 
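The sensor mix described above can be summarized as a small configuration sketch. This is illustrative only—the type and variable names are assumptions, though the counts and detection ranges come from NVIDIA's announcement:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSpec:
    kind: str
    count: int
    notes: str = ""

# Hyperion 8 sensor set as described by NVIDIA (counts from the announcement).
HYPERION_8_SENSORS = [
    SensorSpec("camera", 12, "supplied by Sony and Valeo"),
    SensorSpec("ultrasonic", 12, "Valeo, distance measuring"),
    SensorSpec("radar", 9, "HELLA short-range, Continental long-range"),
    SensorSpec("lidar", 1, "Luminar Iris, forward facing"),
]

# Luminar Iris detection ranges in meters, per the announcement.
IRIS_RANGES_M = {
    "drivable_space": 80,
    "lane_markings": 150,
    "vehicles_and_objects": 250,
}

total = sum(s.count for s in HYPERION_8_SENSORS)
print(f"Total sensors: {total}")  # 12 + 12 + 9 + 1 = 34
```

Thirty-four sensors feeding one compute platform is a useful reminder of why the abstraction and processing layers discussed next matter so much.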



Presentation of NVIDIA's DRIVE self-driving car. 

The development stack is equally important. NVIDIA's standard SDK includes its own Ampere-architecture GPUs for AI testing and validation. This helps ensure that the car's autonomous brain can work in real time in the real world. Next, the DriveWorks Sensor Abstraction Layer leverages plug-ins to simplify sensor setup. The companion DRIVE AV software houses the neural networks behind mapping, perception, planning, and control. 
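A plug-in-based sensor abstraction layer generally works by hiding vendor-specific drivers behind a common interface. The sketch below shows that pattern in broad strokes—it is not the actual DriveWorks API (which is a C/C++ SDK), and every name here is invented for illustration:

```python
from abc import ABC, abstractmethod

class SensorPlugin(ABC):
    """Common interface every sensor plug-in implements (hypothetical)."""
    @abstractmethod
    def read(self) -> dict:
        ...

class CameraPlugin(SensorPlugin):
    def read(self) -> dict:
        # A real plug-in would wrap the vendor driver here.
        return {"type": "camera", "frame": "raw image data"}

class LidarPlugin(SensorPlugin):
    def read(self) -> dict:
        return {"type": "lidar", "points": "point cloud data"}

class SensorManager:
    """Registers plug-ins so downstream code never touches vendor APIs."""
    def __init__(self):
        self._plugins = {}

    def register(self, name: str, plugin: SensorPlugin) -> None:
        self._plugins[name] = plugin

    def poll_all(self) -> dict:
        # Downstream perception code consumes one uniform reading format.
        return {name: p.read() for name, p in self._plugins.items()}

manager = SensorManager()
manager.register("front_camera", CameraPlugin())
manager.register("roof_lidar", LidarPlugin())
readings = manager.poll_all()
print(sorted(readings))  # ['front_camera', 'roof_lidar']
```

The design payoff is that swapping a Sony camera for a Valeo one, or adding a second radar, becomes a registration change rather than a rewrite of the perception stack.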

Hyperion 8 is designed to decrease time to market for any OEMs that use it. Hyperion also features some key hardware components at its core.


DRIVE Orin

DRIVE Orin is a system-on-chip (SoC) designed specifically to run massive numbers of neural networks and autonomous applications simultaneously. The SoC achieves 254 trillion operations per second (TOPS) and contains 17 billion transistors while performing almost seven times better than its Xavier predecessor. Orin also remains software-compatible with Xavier. 
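It's worth sanity-checking that "almost seven times" figure. Orin's 254 TOPS comes from the announcement; Xavier's 30 TOPS is NVIDIA's published spec for the earlier SoC and is an assumption here, not a number from this article:

```python
# Orin's 254 TOPS is from the announcement; Xavier's 30 TOPS is NVIDIA's
# published spec for the earlier SoC (an assumption, not stated here).
ORIN_TOPS = 254
XAVIER_TOPS = 30

raw_ratio = ORIN_TOPS / XAVIER_TOPS
print(f"Raw TOPS ratio: {raw_ratio:.1f}x")  # ~8.5x peak throughput
```

The raw peak-throughput ratio (~8.5x) is higher than the quoted ~7x, which is consistent with the claim referring to application-level performance—real workloads rarely saturate a chip's peak TOPS.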

Orin supports everything from Level 2+ AI-assisted driving to Level 5 fully driverless operation. It's also compliant with the ISO 26262 functional-safety standard, meeting ASIL D, the standard's most stringent automotive safety integrity level. 



DRIVE Orin offers AV confidence view and driver/occupant monitoring, among other features. 

Over 15 OEMs are slated to use DRIVE Orin within their upcoming EV and AV applications. NVIDIA claims Orin will help enable the following:

  • Confidence-view visualization (3D rendering of surroundings model)
  • Digital clusters (driver information) 
  • Infotainment
  • Interactive passenger AI

NVIDIA thus describes Orin as a four-domain chipset. It operates as the AV's brain throughout the vehicle's lifetime. It also presents a major opportunity for consolidation within the vehicle's electrical system. While dozens of electronic control units (ECUs) power a traditional vehicle's (and even an EV's) electrical functions, Orin is said to handle all of those responsibilities single-handedly.

The system is also upgradeable over extended periods of time. 


DRIVE Concierge

Concierge bundles a number of in-cockpit services together—using NVIDIA's hardware and software stacks—to enable hands-free parking and AI interaction. Concierge uses DRIVE IX and Omniverse Avatar to power always-on services. 

The system is designed to streamline one's everyday driving experience. It's billed as an intelligent assistant that can understand natural conversational dialogue. DRIVE Concierge will help drivers make phone calls, book reservations, and push contextual alerts as needed. The driver interacts with a digitally rendered character throughout the journey. It's an interesting middle ground for OEMs that want to provide autonomous AI functionality, yet don't currently have a fully functional AV available. 


NVIDIA's automated parking architecture.

For the complete AV, Concierge gives way to Chauffeur—which is purpose-built to navigate highway and urban traffic. Its role is to offload driving tasks to the car itself instead of requiring human input. Both Concierge and Chauffeur draw on Hyperion 8's sensor and compute components. 


Driving the Future

The current level of buy-in from startups and established automakers is an encouraging sign for NVIDIA. The company continues to push its software and hardware offerings forward—confident after successful trials with older systems like Xavier.

DRIVE Hyperion 8, DRIVE Orin, DRIVE Concierge, and DRIVE Chauffeur may have a promising future. As the company continues to pour billions into autonomous R&D, we can expect further maturation of these platforms.