Washington, Apr 26 (EFE).- An investigation by US federal authorities into 467 crashes involving Tesla vehicles found a "critical safety gap" associated with drivers' false expectations of the Autopilot system, which led to preventable crashes that killed at least 13 people and injured dozens.

The National Highway Traffic Safety Administration's report, released Friday on its website after a three-year investigation, concluded that Tesla's "weak driver engagement system was not appropriate."

According to the report, "Autopilot controls did not sufficiently ensure driver attention and appropriate use," leading to driver overconfidence and disengagement, even in conditions outside of the manufacturer's recommendations, such as wet roadways.

"This mismatch resulted in a critical safety gap" related to expectations of over "the L2 system's operating capabilities and the system's true capabilities," NHTSA noted.

The report comes days after Tesla CEO Elon Musk asserted that the company will soon have its vehicles operating autonomously.

Musk went so far as to say Tuesday in a conference call, following the release of the company's first-quarter financial results, that "if somebody doesn't believe Tesla's going to solve autonomy, I think they should not be an investor in the company."

Musk has been pitching Tesla as an autonomous vehicle company for years, which has caused problems for the manufacturer.

In 2017, the law firm Hagens Berman announced a class-action lawsuit against Tesla for allegedly misleading consumers about the capabilities of its Autopilot system.

In 2021, Musk went so far as to claim that the company would be "close to Level 5" by the end of the year, referring to the maximum level of autonomy, at which the vehicle can navigate in all conditions without driver intervention.

That same year, several senators asked the Federal Trade Commission to investigate Tesla's advertising of Autopilot.

"We fear that Tesla's Autopilot and FSD features are not as mature and reliable as the company pitches to the public," said the democratic senators' letter.

Despite its name, Autopilot, Tesla's driver-assistance system, is only a Level 2 system, meaning the car can handle steering and acceleration but the driver must remain ready to take the wheel.

Other manufacturers have equipped their vehicles with L2 driver assistance systems.

NHTSA said its findings are similar to those of an internal Tesla investigation, which led to a December 2023 software update intended to improve the driver monitoring system in the company's roughly 2 million vehicles in the US.

The federal agency added that it has opened an investigation to determine whether the new software fixes the problems found, and noted that crashes involving vehicles with Autopilot engaged have occurred since the update. EFE

© 2024 EFE News Services (U.S.) Inc., source EFE Ingles