- This means every incident involving these vehicles, even when controlled manually, would become public knowledge
- The state’s DMV is concerned about reports of dangerous behavior by the system
- Tesla maintains its “Full Self-Driving” is not, in fact, fully autonomous and shouldn’t be treated as such
The California State DMV is now looking into the possibility of treating Tesla cars equipped with the automaker’s “Full Self-Driving” (FSD) as fully autonomous vehicles.
This would force Tesla to disclose every crash that happens on public roads with these vehicles, even when the self-driving system is deactivated, as is the norm with other fully autonomous vehicles.
This decision was prompted in part by the release of a beta version of FSD to selected owners, many of whom live in California. Videos of this system committing serious driving mistakes have been posted to YouTube, and reports of erratic behavior have been filed.
In addition, a software update caused “phantom braking,” a condition that saw many Tesla vehicles perform unexpected emergency braking maneuvers because the system’s cameras picked up imaginary obstacles.
To the automaker’s credit, the bulk of this problem was solved in another update that was released the day after the first reports of this issue.
Nevertheless, the California government wants to take a closer look at the company’s testing practices and the actual danger they pose to people on the state’s streets.
Let’s not forget the federal government is also investigating Tesla in a probe launched after multiple reports of Tesla vehicles, operating under the driver assistance feature, crashing into emergency response vehicles parked on the side of the road.
Despite what the name implies, Tesla maintains that “Full Self-Driving,” a feature available on every Tesla for a price that will soon increase to $12,000 in the US, is not capable of autonomous driving and requires a fully attentive driver at all times.