[AutonomousVehicles] New Nvidia Autonomous Vehicle Technology is in new vehicles shipping this year
Cornelius Butler
corn at butlernewmedia.com
Sun Jan 11 17:36:39 UTC 2026
Nvidia has launched a new self-driving technology platform. The
announcement came at CES. Here is an overview with the link and article
text. This is going to bring self-driving technology to a lot of vehicles.
Article link:
https://electrek.co/2026/01/05/nvidia-unveils-open-source-ai-for-autonomous-driving-ships-in-mercedes-benz-cla-in-q1-2026/
Article text:
Nvidia unveils open-source AI for autonomous driving, ships in
Mercedes-Benz CLA in Q1 2026
Fred Lambert
| Jan 5 2026 - 5:53 pm PT
Nvidia (NVDA) held its CES 2026 keynote today, and as expected, Jensen
Huang dropped a massive amount of news on the autonomous driving front. The
biggest takeaway? Nvidia is moving beyond just “perceiving” the road to
“reasoning” about it with a new family of open-source models called
Alpamayo, which will power new autonomous and driver-assistance features,
starting with Mercedes-Benz as soon as this quarter.
Here’s the breakdown of everything Nvidia announced for self-driving
technology today.
The ‘Alpamayo’ Reasoning Model
Nvidia is calling this the “ChatGPT moment for physical AI.”
The company unveiled Alpamayo, a family of open-source AI models designed
to solve the “long tail” problem of autonomous driving: the rare, weird
edge cases that usually cause self-driving stacks to disengage or fail.
The flagship is Alpamayo 1, a 10-billion-parameter Vision-Language-Action
(VLA) model. Unlike traditional AV stacks that just detect objects and plan
a path, Alpamayo uses “chain-of-thought” reasoning. It processes video
input and generates a trajectory, but crucially, it also outputs the logic
behind its decision.
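To make the idea concrete, here is a minimal sketch of what a VLA-style output might look like: a planned trajectory paired with a human-readable reasoning trace. The names (VLAOutput, plan_with_reasoning) and the mock inference are purely illustrative assumptions, not Nvidia's actual Alpamayo API.

```python
from dataclasses import dataclass


@dataclass
class VLAOutput:
    """Hypothetical VLA result: a path plus the logic behind it."""
    trajectory: list[tuple[float, float]]  # planned (x, y) waypoints
    reasoning: str                         # chain-of-thought decision trace


def plan_with_reasoning(frames: list) -> VLAOutput:
    # Stand-in for model inference: a real VLA would consume camera
    # frames and emit both a trajectory and its reasoning. Values here
    # are canned examples for illustration only.
    waypoints = [(0.0, 0.0), (1.0, 0.2), (2.0, 0.5)]
    trace = ("Detected a stopped delivery van in the right lane; "
             "nudging left within the lane to pass safely.")
    return VLAOutput(trajectory=waypoints, reasoning=trace)


out = plan_with_reasoning(frames=[])
print(out.reasoning)
```

The point of the shape is the second field: a conventional planner returns only the waypoints, while a reasoning model also surfaces an auditable explanation of why it chose them.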
Jensen Huang explained that the model can “think through rare scenarios”
and explain its driving decisions.
To sweeten the deal for developers, Nvidia is going the open-source route.
They are releasing:
Alpamayo 1 model weights on Hugging Face.
AlpaSim, an open-source end-to-end simulation framework.
Physical AI Open Datasets, containing over 1,700 hours of driving data
covering complex scenarios.
This is a clear play to become the default “Android of Autonomy” while
Tesla continues to keep its Full Self-Driving (FSD) stack completely closed.
Mercedes-Benz CLA: the first with NVIDIA’s new AV stack
We’ve been hearing about the Nvidia-Mercedes partnership for years, but
today we got a concrete timeline.
Huang confirmed that the 2025 Mercedes-Benz CLA will be the first
production vehicle to ship with Nvidia’s entire AV stack, including the new
Alpamayo reasoning capabilities.
It’s officially launching as a “Level 2+” system, much like Tesla’s
‘Full Self-Driving’ (in reality a Level 2 driver-assistance system, since
it requires the driver’s attention at all times), but the goal appears to
be to push toward Level 4 capabilities.
Here’s how Mercedes describes the system right now:
With Mercedes-Benz’s MB.DRIVE ASSIST PRO, driving assistance and navigation
merge to create a completely new and safe driving experience. At the press
of a button, the vehicle can help navigate through the city streets – from
the parking lot to the destination – with advanced SAE-Level 2 assistance.
Thanks to Mercedes-Benz’s cooperative steering approach, steering adaptions
are possible at any time without deactivating the system.
The sensor stack consists of 30 sensors, including 10 cameras, 5 radar
sensors and 12 ultrasonic sensors.
The Hardware: Vera Rubin
Powering all this backend training and simulation is Nvidia’s new Vera
Rubin platform, the successor to Blackwell. It’s a six-chip AI platform
that Nvidia claims is now in full production. While much of this is
data-center focused, the “Rubin” GPUs and “Vera” CPUs are what will likely
be training the next iterations of Alpamayo that end up in your car.
Electrek’s Take
This is a very interesting move from Nvidia.
The fact that Alpamayo outputs a “reasoning trace” is huge for regulators
who are terrified of black-box AI models crashing cars without us knowing
why.
The open-source aspect is also brilliant. By giving away the model and the
simulator, Nvidia ensures that startups and other automakers get hooked on
their CUDA ecosystem. If you can’t build an autonomous system by yourself
(which, let’s be honest, most legacy automakers can’t), you now just grab
Alpamayo and run it on Nvidia chips.
As for the Mercedes CLA, “Level 2+” feels like a plan to deliver
something like what Tesla has with FSD, but without the promise of
unsupervised self-driving, a promise Tesla has consistently failed to
deliver on despite selling it to its customers since 2016.
If Mercedes actually ships a car in Q1 with capabilities similar to
Tesla’s FSD, and it is based on an open-source system that any automaker
can adopt, it could shake up the industry and start to commoditize this
idea of “Level 2+” autonomous systems.