Uber has more than 20 autonomous vehicle partners, and they all want one thing: data. So the company says it will supply it through a new division called Uber AV Labs.
Despite the name, Uber is not returning to developing its own robotaxis, an effort it shut down after one of its test vehicles killed a pedestrian in 2018. (Uber eventually sold the division to Aurora in 2020 in a complex deal.) But it will send its own sensor-laden cars out into cities to collect data for partners like Waymo, Waabi, Lucid Motors, and others – although contracts have not yet been signed.
In general, self-driving systems are in the midst of a shift away from rule-based operation toward a heavier reliance on reinforcement learning. As that happens, real-world driving data becomes increasingly valuable for training those systems.
Uber told TechCrunch the autonomous vehicle companies that want this data the most are those that already collect a lot of it themselves. This is a sign that, like many frontier AI labs, they understand that “solving” the most extreme edge cases is a volume game.
Physical limitations
Currently, the size of autonomous vehicle companies’ fleets puts a physical limit on the amount of data they can collect. And while many of these companies build simulations of real-world environments to root out edge cases, nothing beats driving on real roads – and driving a lot – when it comes to discovering all the strange, difficult, and flat-out unexpected scenarios a car can wind up in.
Waymo exemplifies this gap. The company has had autonomous vehicles operating or in testing for more than a decade, yet its robotaxis have still been caught doing things like illegally driving past stopped school buses.
Having access to a larger pool of driving data could help robotaxi companies address some of these issues before, or as, they arise on the road, Uber chief technology officer Praveen Neppalli Naga told TechCrunch in an exclusive interview.
And Uber won’t charge for the data. At least, not yet.
“Our goal, primarily, is to democratize this data, right? I mean, the value of this data and the advancement of our partner AV technology is greater than any money we can make,” he said.
Uber’s VP of engineering Danny Guo said the lab needs to build a foundational dataset before it can figure out the product’s market fit. “Because if we don’t do this, we really don’t believe anyone else can,” Guo said. “So as someone who can unlock the whole industry and accelerate the whole ecosystem, we believe we have to take this responsibility now.”
Screws and sensors
The new AV Labs division started small. So far, it has only one car (a Hyundai Ioniq 5, although Uber says it’s not married to a single model), and Guo told TechCrunch that his team is still tinkering with its sensor suite of lidar, radar, and cameras.
“We don’t know if the sensor kit will fall off, but that’s the scrappiness we have,” he said with a laugh. “I think it will take some time to, say, put 100 cars on the road to start collecting data. But the prototype is there.”
Partners will not receive raw data. Once the Uber AV Labs fleet is up and running, Naga said, the division will “massage and work with the data” to match what each partner needs. That layer of “semantic understanding” is meant to help driving software at companies like Waymo improve their robotaxis’ real-time path planning.
Even then, Guo said there is an intermediate step in the works, in which Uber will connect its partners’ driving software to AV Labs’ cars so it can operate in “shadow mode.” Anytime an Uber AV Labs driver does something different from what the autonomous vehicle software running in shadow mode would have done, Uber will flag it to the partner company.
This not only helps find flaws in the driving software, but also helps train the model to drive more like a human and less like a robot, Guo said.
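To make the idea concrete, here is a rough, hypothetical sketch of the kind of comparison a shadow-mode setup implies: the partner’s planner runs passively against the same drive the human performs, and any large divergence between the two gets flagged for review. Uber has not described its implementation; all names and thresholds below are illustrative assumptions, not Uber’s or any partner’s actual API.

from dataclasses import dataclass

@dataclass
class DrivingFrame:
    # One synchronized snapshot of what the human did vs. what the
    # shadow planner would have done (hypothetical fields).
    timestamp: float
    human_steering: float    # degrees, the human driver's actual input
    human_speed: float       # m/s
    planner_steering: float  # degrees, the shadow planner's proposal
    planner_speed: float     # m/s

def find_divergences(frames, steer_tol=5.0, speed_tol=2.0):
    """Return the frames where human and shadow planner meaningfully disagree."""
    flagged = []
    for f in frames:
        if (abs(f.human_steering - f.planner_steering) > steer_tol
                or abs(f.human_speed - f.planner_speed) > speed_tol):
            flagged.append(f)
    return flagged

# Example: a frame where the human swerved but the planner would have held its lane.
frames = [DrivingFrame(0.0, 12.0, 8.5, 0.5, 11.0)]
for f in find_divergences(frames):
    print(f"divergence at t={f.timestamp}s: human steer {f.human_steering}°, "
          f"planner steer {f.planner_steering}°")

In practice the flagged moments are exactly the scenarios Guo describes sending back to partners, both as potential bugs and as human-driving examples to train on.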
Tesla’s approach
If this approach sounds familiar, it’s because it’s essentially what Tesla has been doing to train its own autonomous driving software for the past decade. Uber’s effort won’t have the same scale, as Tesla has millions of customer cars driving on roads around the world every day.
That doesn’t bother Uber. Guo said he hopes to create more targeted data collection based on the needs of autonomous vehicle companies.
“We have 600 cities that we can pick and choose (from). If a partner tells us there’s a certain city they are interested in, we can just deploy our (cars) there,” he said.
Naga said the company expects to grow the new division to several hundred people within a year, and that Uber wants to move quickly. And while he looks toward a future where Uber’s entire fleet could be used to collect training data, he acknowledged the new division has to start somewhere.
“From the conversations with partners, they just said: ‘Give me anything that can be useful.’ Because the amount of data that Uber can collect is greater than anything they can do with their own data collection,” Guo said.

