The chairman of the U.S. National Transportation Safety Board (NTSB) concluded on Tuesday that “operational limitations” in the Tesla Model S played “a major role” in a May 2016 crash that ended with the death of a driver who was using the car’s semi-autonomous Autopilot system.

The NTSB said the Autopilot system had performed as intended but lacked safeguards to prevent drivers from misusing it.

Tesla’s electric vehicles have come under public scrutiny after several fatal crashes involving the Autopilot function. Image Credit: BidnessEtc

Robert L. Sumwalt, the chairman of the NTSB, said at a meeting Tuesday that human error and the “lack of sufficient system controls” resulted in a fatal crash that should not have happened.

NTSB says Tesla could have done more to monitor driver’s attention

The NTSB said other limitations of the system included Tesla’s failure to ensure that drivers used Autopilot only on certain roads and its failure to monitor driver engagement. The agency recommended that auto safety regulators and vehicle manufacturers take steps to ensure that semi-autonomous systems are not used improperly.

“System safeguards were missing,” said Sumwalt, according to Reuters. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention.”

Tesla released a statement saying that Autopilot “significantly increases safety,” citing a government study that suggested the system reduced the incidence of collisions. However, the Elon Musk-led company said it would evaluate the NTSB’s recommendations.

The company added that it would continue to be “extremely clear” with current and potential customers that Autopilot is not a fully self-driving technology and that drivers need to remain attentive at all times.

The accident occurred in May last year, when 40-year-old Joshua Brown was killed near Williston, Florida, after his Model S crashed into a truck while operating in “Autopilot” mode. Brown was a known Tesla fan who had even posted a video of himself using the “Autopilot” feature on his semi-autonomous vehicle.

‘Autopilot’ feature worked as intended

The accident raised questions about the safety of semi-autonomous systems that can carry out driving tasks for prolonged periods with little or no human input. However, such systems still require an attentive driver, as they are not fully autonomous.

In January, the National Highway Traffic Safety Administration’s report on the accident concluded that Tesla’s semi-autonomous vehicles did not need to be recalled. That report focused on whether the crash had been caused by flaws in the car, and officials said they found none.

The NTSB on Tuesday said the self-driving system’s “operational design” had played a role in the 2016 crash because it allowed drivers to go without steering or watching the road for extended periods, in a manner “inconsistent” with warnings from Tesla.

The Tesla crash scene in Florida last year. Image Credit: NDTV

The agency said that Tesla could have taken more measures to ensure that drivers understood the vehicles were not completely automatic. The NTSB noted that “Autopilot” had worked as intended but did not do enough to ensure that Brown paid adequate attention to the road. Moreover, Tesla drivers could use “Autopilot” on some roads at speeds of up to 90 miles per hour, according to an NTSB spokesman.

The agency added that Tesla did not ensure that the system was used only on highways and limited-access roads, as recommended in the owner’s manual. It recommended that automakers monitor driver attention in several ways, not just through steering-wheel engagement. According to the NTSB, the system could not completely detect cross traffic, and “did little to constrain the use of autopilot to roadways for which it was designed.”

Brown family says Tesla car is not to blame for the accident

The NTSB board stressed that monitoring driver attention by measuring steering-wheel engagement was “a poor surrogate for monitored driving engagement.” In June 2016, following the accident, Tesla said in a statement that its “Autopilot” system is not perfect and still requires the driver to remain alert on the road.

At a Tuesday hearing on the accident, NTSB officials said the truck driver and Brown had at least 10 seconds to observe and respond to each other. Meanwhile, Brown’s family said on Monday that the Tesla Model S was not to blame for the collision.

“We heard numerous times that the car killed our son. That is simply not the case,” said the family’s statement, according to Reuters. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car. People die every day in car accidents. Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”

The NHTSA said it would review the NTSB’s findings.

Source: Reuters