As AI and robotics become an ever-more important presence in everyday life, the use cases are quickly going from science fiction to real life.
One of the most popular areas of interest is autonomous vehicles: self-driving cars able to navigate the roads and get us to our destination without the driver needing to touch the wheel.
Hit the streets
Nvidia has been at the forefront of autonomous driving for some time, developing its Hyperion platform and the ecosystem on top of it, and working with a range of top automakers across the world, including the likes of Geely, BYD, Nissan and Hyundai, alongside long-standing collaborations with GM and Mercedes-Benz.
My drive lasted around 45 minutes in downtown San Jose, operating on a pre-set route that gave me a taste of how the technology would work in a variety of road set-ups and conditions, including single and multiple lane traffic in urban and suburban settings.
The Hyperion 8 technology we experienced operates at “level 2” autonomy, meaning a human in the driver’s seat can intervene at any moment and disengage the system by touching the brake pedal – and in fact, the car mandated that this person touch the steering wheel every now and then to ensure they weren’t distracted or asleep.
As anyone who has ridden in a self-driving car knows, the experience can be a bit alarming at first (particularly as I was in the front passenger seat), but after the first few junctions, I was able to relax and enjoy the incredibly plush car.
The car itself featured 10 cameras and five radars, situated around the front and rear of the vehicle, allowing it to visualize the world around it and react where needed. This was supported by Nvidia’s Alpamayo end-to-end stack, trained on real and synthetic data, alongside a fully traceable stack for extra safety. Although the set-up we experienced is not yet for sale, it should be available later in 2026.
Several actions instantly impressed me, showing the software’s capabilities in decision-making and intelligence.
For example, a city bus pulling out unexpectedly to avoid a parked car led our vehicle to indicate and pull into the adjoining lane, avoiding a collision. The vehicle was also able to spot an elderly person starting to cross in the middle of a residential street, away from an official crossing, and slowed and moved over to ensure we never made contact.
The car was also able to anticipate an upcoming turn on our pre-set route, getting into the correct lane a block early, meaning there was no need to jump a queue of turning vehicles or attempt a risky manoeuvre across other lanes.
A number of unprotected turns and stop-sign junctions also showed how the technology allowed the car to pull forward effectively while remaining ready to stop should an unexpected collision risk appear.
Most impressively, when a large truck (somewhat bizarrely) reversed sharply from a parking lot across our lane of traffic and into the far lane (a situation which left even our experienced guide driver with a raised pulse), the car was able to brake to avoid a collision.
Overall, my journey in the autonomous vehicle was undoubtedly a success – the car felt in control and able to react to the world around it, so I never felt unsafe. With a wider roll-out scheduled for the next few years, it’ll be interesting to see how it develops, and I look forward to trying it out again soon.




