

A Cartoonish Crash Test Raises Real Questions About Tesla’s Autopilot

Mark Rober pits Tesla against lidar in a real-life Wile E. Coyote experiment.

Tudor Tarita
March 19, 2025 @ 10:34 am


Mark Rober wanted to know if Tesla’s self-driving car could be deceived in the most cartoonish way possible.

The former NASA engineer and YouTube sensation built a wall designed to trick a car. The wall stretched across the road, painted to look like the asphalt continuing straight ahead. Would Tesla’s Autopilot system recognize the deception in time to stop? Or would it speed forward like Wile E. Coyote chasing the Road Runner?

In the moment of truth, a Tesla Model Y, equipped with its camera-based Autopilot system, barreled forward at 40 miles per hour. The result was spectacular in the worst possible way: a gaping, cartoon-style hole as the car smashed through the fake road. Meanwhile, a second vehicle, this one fitted with lidar, a laser-based sensing system, stopped cleanly before impact.

The video was an instant hit, racking up 10 million views in just two days. But as with anything related to Tesla, the crash test didn’t just spark curiosity—it ignited a firestorm.

Beep Beep! Credit: Mark Rober

Camera vs Lidar

Tesla’s approach to driver assistance has long been a subject of debate. Unlike most autonomous vehicle developers, who rely on a combination of cameras, radar, and lidar, Tesla has doubled down on cameras alone. The company removed radar from its vehicles in 2021, betting that neural networks trained on camera data could replicate, and eventually surpass, human perception.

Elon Musk has called lidar a “fool’s errand.” But Rober’s test suggests that, at least for now, the technology has a clear advantage. The lidar-equipped vehicle correctly identified the fake road as an obstacle, while the Tesla, trusting its cameras, saw only an open highway.

That wasn’t Tesla’s only fumble. In a separate test under clear conditions, Autopilot successfully avoided both a stationary child-sized dummy and one that suddenly ran into its path. But in fog and heavy rain, it failed, flattening the dummy. The lidar system, by contrast, detected the mannequin every time.

This shouldn’t have been a surprise. Cameras struggle with poor visibility. Lidar, which actively scans the environment using lasers, doesn’t. The technology is more expensive and requires significant data processing, but as Rober’s experiment demonstrated, it can see what cameras miss.
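The principle behind that advantage is simple enough to sketch. A lidar unit fires a laser pulse and times its round trip to a surface, so a painted wall registers at its true distance no matter what it looks like. A minimal illustration of that time-of-flight calculation (not Tesla's or any lidar vendor's actual code):

```python
# Time-of-flight ranging: distance is inferred from how long a laser
# pulse takes to bounce off a surface and return to the sensor.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting surface, given pulse round-trip time.

    The pulse travels to the surface and back, so we halve the path.
    """
    return C * round_trip_s / 2

# A pulse returning after ~100 nanoseconds means a surface roughly
# 15 meters ahead, regardless of how that surface is painted.
print(round(tof_distance_m(100e-9), 2))  # prints 14.99
```

Because the measurement is geometric rather than visual, a camera-fooling mural presents the same hard return as a concrete wall, which is exactly what the lidar car reacted to in Rober's test.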

Nnnnnope! Credit: Mark Rober

Controversy and Conspiracies

The test was not without controversy, however. Some Tesla supporters questioned whether Autopilot had even been engaged during the wall crash. Others claimed Rober manipulated the footage, secretly pushing an anti-Tesla agenda on behalf of Big Lidar.

The scrutiny became so intense that Rober released unedited footage showing that Autopilot had, in fact, been active. But eagle-eyed viewers noticed something else: just before impact, the system appeared to disengage. That led to a new round of speculation—was this a deliberate Tesla feature to avoid responsibility for crashes?

It wouldn’t be the first time the issue had come up. In 2022, the National Highway Traffic Safety Administration (NHTSA) investigated dozens of Tesla crashes involving stationary emergency vehicles. In 16 cases, Autopilot “aborted vehicle control less than one second prior to the first impact.” Critics suspect this is a convenient way to avoid liability. Unsurprisingly, Tesla has denied any wrongdoing.

Well that went well. Credit: Mark Rober

The Real Takeaway

Rober’s test wasn’t perfect, and we can’t independently verify that nothing was tampered with. Ultimately, the video was designed to be entertaining, and some elements, like the exaggerated hole in the wall, were added for spectacle. But the core lesson is hard to ignore: Autopilot is not a true self-driving system. It’s a Level 2 driver-assistance feature, meaning the driver is expected to remain engaged at all times.

Simply put, you can’t rely on it. You’re still driving the car.

Tesla’s defenders argue that Full Self-Driving (FSD), the company’s more advanced software, wasn’t tested. But FSD relies on the same camera-based approach, raising questions about whether it would have fared any better.

And while a painted wall might seem like an absurd scenario, the same underlying problem—camera-based systems misinterpreting their surroundings—has led to real-world tragedies. In 2016, a Tesla driver was killed when Autopilot failed to recognize a truck trailer crossing its path. The system mistook the bright white trailer for open sky.

Even if most drivers won’t encounter a Wile E. Coyote-style trap, fog, rain, and other visibility issues are everyday realities. And if a system that claims to be the future of autonomous driving can’t handle those, what else is it missing?
