California has informed Tesla that it is considering stricter regulations for the electric car maker’s driver aids that are currently being tested on public roads, following videos posted online of troubling episodes.
Several clips on YouTube and Twitter show drivers testing the full self-driving (FSD) beta having to suddenly retake control of their vehicles to stop their Tesla from hitting a pole or veering into the oncoming lane.
Tesla has noted that the tools require active driver supervision, but the California Department of Motor Vehicles said in a letter to the company on Jan. 5 that it is assessing whether the features meet the definition of an autonomous vehicle.
Elon Musk’s car company has recruited some motorists for real-life testing of FSD beta, which is supposed to be able to drive in the city, stop automatically or make turns.
California’s DMV wrote in its letter that it is revising its “classification decision following recent software updates, videos showing dangerous uses of that technology, and open investigations” from US regulators.
“DMV will begin further review of the latest releases, including any extensions to the program and features,” the letter said.
If the DMV decides to classify Tesla's driver assistance systems as an autonomous vehicle, the company will face stricter rules. For example, Tesla would have to report any problems it encounters to the agency and identify all drivers testing its new tools.
The company did not respond to a request for comment.