The autonomous vehicle industry reached a significant milestone at CES 2026 with the unveiling of a production-ready robotaxi developed through an unprecedented collaboration between Uber Technologies, Lucid Motors, and autonomous driving specialist Nuro. The luxury electric robotaxi, based on Lucid’s Gravity SUV platform, is already undergoing supervised on-road testing in the San Francisco Bay Area ahead of its planned commercial launch later in 2026.
The partnership combines Lucid’s advanced electric vehicle engineering, Nuro’s proven Level 4 autonomous driving technology, and Uber’s global ride-hailing network into what the companies describe as the industry’s most luxurious autonomous ride-hailing experience. The collaboration builds on Uber’s $300 million investment in Lucid announced in September 2025, which committed the ride-hailing giant to deploying up to 20,000 Lucid vehicles equipped with autonomous technology over a six-year period.
The robotaxi revealed at CES represents a production-intent vehicle, meaning it closely resembles the final version that will enter service. Built on the all-electric Lucid Gravity platform, the vehicle offers up to 450 miles of range and fast-charging capabilities that maximize availability for continuous operation. The spacious SUV configuration can accommodate up to six passengers comfortably with generous luggage space, positioning it as a premium solution for group travel and airport trips.
What distinguishes this robotaxi from competing autonomous vehicles is the comprehensive sensor array integrated throughout the vehicle. High-resolution cameras, solid-state lidar sensors, and advanced radar systems provide 360-degree environmental perception. These sensors are embedded throughout the Gravity’s body and integrated into a purpose-designed roof-mounted module called the Halo, which maintains a low profile to preserve the vehicle’s sleek appearance while housing critical sensing hardware.
The Halo serves multiple functions beyond sensor housing. Integrated LED lights display rider initials to help passengers identify their vehicle in crowded pickup areas, similar to the system used on Waymo’s autonomous Jaguar I-PACE fleet. The LEDs also provide clear status updates throughout the ride, from pickup confirmation through final drop-off, giving riders visual feedback on what the vehicle is doing.
Powering the autonomous driving system is Nvidia’s DRIVE AGX Thor computing platform, part of the broader DRIVE Hyperion architecture designed specifically for self-driving vehicles. This high-performance compute system processes massive amounts of sensor data in real time, running the AI models that enable the vehicle to perceive its environment, predict the behavior of other road users, and plan safe, comfortable paths through complex urban environments.
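Neither Nuro nor Nvidia has published implementation details, but the perceive-predict-plan loop described above can be sketched conceptually. The following Python sketch is purely illustrative: the class names, the constant-velocity motion model, and the lane-corridor check are all simplifying assumptions, not Nuro's actual approach.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    kind: str                     # e.g. "car", "pedestrian", "cyclist"
    position: tuple[float, float]  # (x, y) in meters, vehicle frame (x = forward)
    velocity: tuple[float, float]  # (vx, vy) in m/s

def predict(obj: TrackedObject, horizon_s: float, dt: float = 0.5):
    """Constant-velocity prediction: the simplest possible motion model."""
    steps = int(horizon_s / dt)
    x, y = obj.position
    vx, vy = obj.velocity
    return [(x + vx * dt * i, y + vy * dt * i) for i in range(1, steps + 1)]

def plan_speed(ego_speed: float, predicted_paths, lane_ahead_m: float = 50.0):
    """Slow down if any predicted path enters the ego lane corridor ahead."""
    for path in predicted_paths:
        for (x, y) in path:
            if 0 < x < lane_ahead_m and abs(y) < 1.5:  # rough lane half-width
                return max(0.0, ego_speed - 2.0)        # decelerate
    return ego_speed                                    # maintain speed
```

A real stack replaces each of these stages with learned models running on the onboard compute, but the data flow (tracked objects in, predicted trajectories through, a speed/path decision out) is the same.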
Nuro’s autonomous driving software represents years of development and real-world testing. The company initially focused on autonomous delivery vehicles before expanding its technology platform to passenger applications. Nuro’s end-to-end AI foundation model blends cutting-edge neural networks with what the company describes as clear, verifiable safety logic, ensuring that autonomous decisions can be understood and validated rather than operating as opaque black boxes.
The testing program Nuro is leading in the Bay Area follows a comprehensive safety validation framework developed through the company’s commercial autonomous deployments. Engineering prototypes equipped with full sensor suites and autonomous capabilities are being driven under supervision by trained safety operators who can intervene if necessary. The testing evaluates dozens of critical capabilities across the entire autonomy stack, from basic perception and path planning to complex scenarios involving pedestrians, cyclists, and unpredictable traffic situations.
Beyond on-road testing, the validation program includes extensive closed-course testing at dedicated facilities and massive-scale simulation that exposes the autonomous system to millions of miles of virtual driving scenarios. This multi-layered approach aims to verify that the robotaxi can safely handle the full range of situations it might encounter during commercial operations.
Inside the vehicle, Uber has designed an in-cabin experience focused on passenger comfort, transparency, and control. Interactive touchscreens allow riders to personalize their journey by adjusting climate settings, selecting music, controlling heated seats, and accessing other amenities. The screens also provide options to contact Uber support or request that the vehicle pull over, maintaining passenger agency even in a fully autonomous experience.
A key feature is the real-time visualization system that shows passengers what the robotaxi perceives and how it plans to navigate. The display presents an isometric graphical view of the vehicle moving through city streets with representations of nearby cars, pedestrians, cyclists, and other objects. As the robotaxi encounters different situations, the display shows its decision-making process including yielding to pedestrians, stopping at traffic lights, changing lanes, and executing safe drop-offs.
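The companies have not described how the display is rendered, but an isometric view like the one described implies projecting vehicle-frame object positions into 2D screen coordinates. As a rough illustration only (the function name and 30° axis angle are assumptions, not details from the source), such a projection could look like:

```python
import math

def to_isometric(x: float, y: float, z: float = 0.0) -> tuple[float, float]:
    """Project vehicle-frame coordinates (meters) into 2D screen space
    using a standard isometric projection with 30-degree axes."""
    angle = math.radians(30)
    screen_x = (x - y) * math.cos(angle)
    screen_y = (x + y) * math.sin(angle) - z
    return (screen_x, screen_y)

# Each tracked car, pedestrian, or cyclist would be projected this way
# every frame and drawn relative to the ego vehicle at the origin.
```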
This transparency addresses one of the persistent challenges of autonomous vehicles: passenger trust. By making the vehicle’s perception and decision-making visible, the system helps riders understand that the robotaxi is aware of its surroundings and making appropriate driving decisions. The approach draws lessons from Waymo’s autonomous fleet, which pioneered many of these user experience concepts.
A crucial manufacturing advantage comes from Lucid’s decision to integrate the autonomous hardware directly on the production line at its Casa Grande, Arizona factory. Unlike competitors such as Waymo that must partially disassemble purchased vehicles to retrofit autonomous systems, the Lucid-Nuro robotaxis will be built from the ground up as autonomous vehicles. This integrated manufacturing approach saves time and money while potentially improving reliability by eliminating the need to modify production vehicles.
Pending final validation in the testing program currently underway, production of commercial robotaxis is scheduled to begin at Lucid’s Arizona facility later in 2026. The companies have not given a specific date for when the first autonomous rides will become available to Uber customers in the Bay Area, but the stated late-2026 launch target suggests commercial operations could begin soon after production starts.
The San Francisco Bay Area represents a strategically important launch market for several reasons. The region has become a testing ground for multiple autonomous vehicle programs, creating a population that is relatively familiar with self-driving technology. Local regulatory frameworks have evolved to accommodate autonomous vehicle testing and deployment. The area’s diverse traffic conditions, from dense urban streets to highway driving, provide a comprehensive environment for validating autonomous capabilities.
If successful, the program will expand to dozens of markets globally over the subsequent six years as Uber works toward its commitment to deploy up to 20,000 Lucid autonomous vehicles. This scale would make the partnership one of the largest autonomous ride-hailing programs announced to date, potentially representing a significant step toward mainstream adoption of robotaxi services.