NTSB says driver in fatal Tesla crash was overreliant on the car's Autopilot system

The driver of a Tesla who was killed in a crash that drew worldwide attention last year was too reliant on the car’s “Autopilot” system when he plowed into the side of a tractor-trailer at more than 70 miles per hour, federal investigators concluded Tuesday.

The National Transportation Safety Board said Joshua Brown’s overreliance on the autopilot system “permitted his prolonged disengagement from the driving task and his use of automation in ways inconsistent with guidance and warnings from the manufacturer.” The Tesla’s “Autopilot” system functioned as designed in the May 7, 2016, crash. However, the system is meant to augment, not replace, the driver, the NTSB said.

“In this crash, Tesla’s system worked as designed, but it was designed to perform limited tasks,” NTSB Chairman Robert Sumwalt said. “The result was a collision that should not have happened. System safeguards were lacking.”


The board said the “operational design” of the vehicle’s autopilot encouraged Brown’s overreliance on it. “Drivers must always be prepared to take the wheel or apply the brakes,” Sumwalt said.

The NTSB findings came as a partial exoneration of Tesla and a relief for those working to put autonomous vehicles on the road.

Linking the Tesla crash to the coming generation of fully autonomous cars fueled public fears of such vehicles, surveys found.

"I think it's important to clear up a possible misconception," Sumwalt said. "The automobile involved in the collision was not a self-driving car."

In the aftermath of the crash, Tesla put more stringent limits on hands-off driving, disabling the autopilot feature if drivers repeatedly ignore the audible and dashboard warnings.

Among the NTSB recommendations Tuesday, the board said automakers should incorporate similar measures and restrict the use of such systems on highways with cross traffic.


An NTSB investigator testified Tuesday that “collision mitigation systems” do not reliably detect cross traffic. The crash has been documented by at least three teams of investigators, including one from the NTSB, which issued a preliminary report in June.

Brown, 40, a former Navy SEAL, was driving down a four-lane highway near Williston, Fla., on a sunny Saturday afternoon with his Tesla Model S set in autopilot mode. The system allows the vehicle to guide itself — using multiple sensors linked to a computer system — like a greatly enhanced cruise control system, and comes with automatic emergency braking designed to avoid frontal collisions.

Two minutes before the crash, according to reports, Brown had set the speed at almost 10 miles per hour above the posted speed limit.

At about 4:40 p.m., a 53-foot tractor-trailer loaded with blueberries that had been traveling in the opposite direction turned left toward a side road, blocking the path of Brown’s Tesla.


The Tesla careened under the truck’s trailer, traveled almost 300 feet farther and snapped off a utility pole, spinning around into a front yard about 50 feet away.

The driver of the blueberry truck, Frank Baressi, 62, told the Associated Press that Brown was “playing Harry Potter on the TV screen.” The Florida Highway Patrol said a DVD player was found in the Tesla, but two of the NTSB investigators on Tuesday disputed that it was being used to watch a video.

“We are quite certain that was not the case,” the NTSB’s Ensar Becic told the board members.

In its preliminary report, the NTSB said Brown had his hands on the wheel for just 25 seconds in the final 37 minutes of his drive. The report said he had received six audible warnings and seven visual dashboard warnings from the autopilot system telling him to keep his hands on the steering wheel.


The National Highway Traffic Safety Administration joined the NTSB, the highway patrol and Tesla in investigating the crash. NHTSA determined that Tesla’s autopilot feature was not at fault, and its investigators said Brown never tried to avoid the truck or apply the brakes before the crash.

Most of the headlines in the aftermath of the crash were accurate, but others confused the Tesla with a fully autonomous vehicle. A British science magazine ran the headline "Tesla driver dies in first fatal autonomous car crash in U.S.," while CNN asked "Can we trust driverless cars?" and the headline on CBS was "This fatality could slam the brakes on driverless cars."

Those were just the sort of headlines that automakers who plan to launch genuine driverless cars dread. The Tesla autopilot is a Level 2 system; Level 5 is the standard for a fully autonomous car.


With a public already skeptical about fully autonomous cars, reaction to the initial mishaps may play a significant role in determining how quickly Americans get comfortable with the new cars.

Traditional automakers plan to gradually introduce features until the day arrives when they've produced a fully autonomous vehicle. But newcomers to the market, like Waymo, plan to put fully autonomous vehicles on the road from day one. Waymo, the name Google's self-driving car project took when it was spun off as an independent company, concluded that a vehicle without a steering wheel or pedals was the way to go after discovering its own employees often got distracted when driving autonomous cars equipped with steering wheels.

Anticipating the attention paid to Tuesday’s NTSB hearing, Tesla issued a statement saying NHTSA had found that Autopilot can reduce crashes by 40 percent.


“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” the Tesla statement said.

“We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

Brown's family issued a statement Monday through its lawyer.

"We heard numerous times that the car killed our son. That is simply not the case," the family statement said. "There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car. "
