Tesla’s first self-driving accident just happened. It’s time to start a serious discussion
It was bound to happen. This week, Jared Overton, a Tesla Motors Model S owner, reported his car started on its own and drove into a trailer parked right in front of it. No one was hurt, but the car’s windshield was broken. Tesla Motors claims their software wasn’t to blame. Instead, internal logs suggest Overton used the ‘summon’ feature by mistake from his phone. Overton disagrees. A lawsuit may follow, but the accident raises a more important question: are Tesla’s autonomous features really safe?
Speaking to a KSL reporter, Overton said he was running an errand at a local business. Five minutes after he went inside, he found his car had driven by itself under a parked trailer.
“We were trying to figure out how on earth the vehicle started on its own,” Overton said. “What happened with this kind of rogue vehicle?”
Following the accident, Tesla issued a statement saying the crash was due to Overton using the ‘summon’ feature which directs the car to park near where the driver is standing. The software, which is in its ‘Beta’ version, can be used to “open your garage door, enter your garage, park itself, and shut down”, and can also operate in reverse when you need to leave.
“Tesla has reviewed the vehicle’s logs, which show that the incident occurred as a result of the driver not being properly attentive to the vehicle’s surroundings while using the Summon feature or maintaining responsibility for safely controlling the vehicle at all times,” the letter signed by a regional service manager read.
“Safety is a top priority at Tesla, and we remain committed to ensuring our cars are among the absolute safest vehicles on today’s roads,” the statement read. “It is paramount that our customers also exercise safe behavior when using our vehicles — including remaining alert and present when using the car’s autonomous features, which can significantly improve our customers’ overall safety as well as enhance their driving experience.”
Overton isn’t convinced, though.
“Even during that 15, 20 second walk right here, we would have easily heard the impact of the vehicle into the back of the trailer,” Overton said. “They can tell me what they want to tell me with the logs, but it doesn’t change what we know happened here. They’re just assuming that I sat there and watched it happen, and I was OK with that.”
“I think it behooves them to figure out what happened, what happened with the vehicle, address it,” Jared Overton said Tuesday. “Just fix it.”
According to Tesla, the vehicle’s sensors might not detect some obstacles that are lower than the fascia or hanging from the ceiling. The company says the summon feature should be used responsibly: users should never leave the vehicle unsupervised while it is active, and should stand by to stop the vehicle via the mobile app if something goes wrong. “You must maintain control and responsibility for your vehicle when using this feature and should only use it on private property,” a Tesla spokesman said.
As for Tesla’s most important autonomous feature, the “autopilot mode”, no accidents have been reported yet, even though the cars have logged millions of miles so far. Moreover, there have been instances in which lives were saved by it, as Ohioan Joshua Brown testified. Two weeks ago, Elon Musk claimed autopilot is already reducing car accidents by 50%.
As Tesla said earlier, “safety is a top priority” for the company, but what happens if there’s a car crash on the road because of a self-driving feature? Will people remember there’s a net benefit, i.e. that autopilot avoids more crashes than it causes? Or will people panic and drive their Teslas themselves from then on? If there’s a fatal accident involving autopilot on the highway, the press will be all over it. It could even doom the company and set self-driving cars back a decade. That would be a shame, too. The Economist reported in August that “if 90% of cars on American roads were autonomous, accidents would fall from 5.5m a year to 1.3m.”
It’s an interesting psychological response, and one for which nobody is really prepared, given that the technology still seems as alien as flying saucers.