

Tesla's first self-driving accident just happened. It's time to start a serious discussion


Tibi Puiu
May 13, 2016 @ 2:50 pm


It was bound to happen. This week, Jared Overton, a Tesla Motors Model S owner, reported that his car spontaneously started itself and drove into a trailer parked right in front of it. No one was hurt, but the car’s windshield was broken. Tesla Motors claims its software wasn’t to blame; instead, internal logs suggest Overton activated the ‘summon’ feature by mistake from his phone. Overton disagrees. A lawsuit may follow, but the accident raises a more important question: are Tesla’s autonomous features really safe?

Jared Overton's Tesla Model S slammed into a parked trailer after the summon feature was activated. It's not clear yet if this was a human error or a Tesla bug. Credit: KSL


Speaking to a KSL reporter, Overton said he was running an errand at a local business. Five minutes after he went inside, he found his car had driven by itself under a parked trailer.

“We were trying to figure out how on earth the vehicle started on its own,” Overton said. “What happened with this kind of rogue vehicle?”

Following the accident, Tesla issued a statement saying the crash was caused by Overton using the ‘summon’ feature, which directs the car to park near where the driver is standing. The feature, still in beta, can be used to “open your garage door, enter your garage, park itself, and shut down”, and can also operate in reverse when you need to leave.

“Tesla has reviewed the vehicle’s logs, which show that the incident occurred as a result of the driver not being properly attentive to the vehicle’s surroundings while using the Summon feature or maintaining responsibility for safely controlling the vehicle at all times,” the letter signed by a regional service manager read.

“Safety is a top priority at Tesla, and we remain committed to ensuring our cars are among the absolute safest vehicles on today’s roads,” the statement read. “It is paramount that our customers also exercise safe behavior when using our vehicles — including remaining alert and present when using the car’s autonomous features, which can significantly improve our customers’ overall safety as well as enhance their driving experience.”

Overton isn’t convinced, though.

“Even during that 15, 20 second walk right here, we would have easily heard the impact of the vehicle into the back of the trailer,” Overton said. “They can tell me what they want to tell me with the logs, but it doesn’t change what we know happened here. They’re just assuming that I sat there and watched it happen, and I was OK with that.”

“I think it behooves them to figure out what happened, what happened with the vehicle, address it,” Jared Overton said Tuesday. “Just fix it.”

According to Tesla, the vehicle’s sensors might not detect some obstacles that are lower than the fascia or hanging from the ceiling. The company says the summon feature should be used responsibly: users should never leave the vehicle unsupervised while the feature is active, and should stand by to stop the vehicle via the mobile app if something goes wrong. “You must maintain control and responsibility for your vehicle when using this feature and should only use it on private property,” a Tesla spokesman said.

As for Tesla’s most important autonomous feature, the “autopilot mode”, no accidents have been reported yet, even though the cars have logged millions of miles so far. Moreover, there have been instances in which it saved lives, as Ohioan Joshua Brown testified. Two weeks ago, Elon Musk claimed autopilot is already reducing car accidents by 50%.

As Tesla said earlier, “safety is a top priority” for the company, but what happens if there’s a crash on the road because of a self-driving feature? Will people remember that there’s a net benefit, namely that autopilot avoids more crashes than it causes? Or will people panic and drive their Teslas themselves from then on? If there’s a fatal accident involving autopilot on the highway, the press will be all over it. It could even doom the company and set self-driving cars back a decade. That would be a shame, too: The Economist reported in August that “if 90% of cars on American roads were autonomous, accidents would fall from 5.5m a year to 1.3m.”

It’s an interesting psychological response, one for which nobody is really prepared given the tech is as alien as flying saucers at the moment.

