For years, Tesla has tested autonomous vehicle technology on public roads without reporting crashes and system malfunctions to the California Department of Motor Vehicles, reports that other robot car developers are required to submit under DMV regulations.
But in the face of dozens of viral videos showing Tesla’s experimental fully self-driving technology driving into dangerous situations, and a concerned letter from one of the state’s key lawmakers, the DMV now says it is reviewing Tesla’s behavior and reassessing its own policies.
The agency told Tesla on January 5 that it was “reconsidering” its opinion that the company’s testing program did not fall under the department’s autonomous vehicle regulations because it required a human driver.
“Recent software updates and videos showing the dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in the field” prompted the reassessment, the DMV said in a letter Monday to Senator Lena Gonzalez (D-Long Beach), chair of the Senate Transportation Committee.
Concerned about public safety, Gonzalez asked the DMV in December to address Tesla’s Full Self-Driving beta program, under which Tesla owners oversee cars programmed to navigate autonomously on highways, city streets and neighborhood roads, stopping at traffic lights and stop signs and making left and right turns into traffic.
These are the same features being tested by other robot car developers who report crashes and disengagements to the DMV, a group that includes Waymo, Cruise, Argo, and Zoox. Although their cars sometimes malfunction, few YouTube videos show them behaving dangerously.
Unlike other companies, Tesla does not use trained test drivers. Participants in the Full Self-Driving beta paid $10,000 for the privilege – a price soon to be raised to $12,000.
If the DMV requires Tesla to comply with its autonomous vehicle testing regulations, the company will have to report malfunctions and system failures, giving the public the hard data needed to assess how safe or dangerous the technology is. Tesla would also face tighter requirements for its test drivers.
So far, the DMV has not required Tesla to report malfunctions and disengagements – situations in which the robot software hands control back to the test driver. The agency said it considers the Full Self-Driving system a “Level 2” driver assistance system, similar to systems from other automakers that include lane keeping, adaptive cruise control and automatic lane change.
Videos on YouTube and Twitter show Tesla Full Self-Driving test vehicles steering into oncoming traffic, choosing railroad tracks instead of the street, and aiming at metal poles and traffic barriers. A Tesla on Autopilot once mistook the moon for a yellow traffic light. Tesla recently added a feature to the technology called “Assertive” mode that allows “rolling stops” at stop signs.
In her December letter, Gonzalez asked the DMV to assess the ongoing Full Self-Driving beta test and whether it poses a risk to the public.
In its response, the DMV cited a Full Self-Driving demonstration that took place more than a year ago, in November 2020: the vehicle “could not safely complete a driving task alone” and was unable to “recognize or respond to” stationary objects, road debris, emergency vehicles, construction zones, large uncontrolled intersections, adverse weather, complicated vehicles in the driving path, and unmapped roads.
Gonzalez’s office said the senator is reviewing the DMV’s response letter and will speak with The Times in the near future.
The Times requested months ago an interview with DMV chief Steve Gordon, but he consistently declined, as he did, through his spokeswoman, on Tuesday.
The agency has in the past referred to California law to defend its current approach. California’s laws on autonomous vehicle technology use definitions drawn from a document published by the Society of Automotive Engineers, which divides vehicle automation into six levels, from Level 0 to Level 5.
The DMV said it considers Full Self-Driving a Level 2 system because, according to Tesla, it requires a human driver to ensure safety. But the same description applies to test cars from robotaxi companies developing Level 4 vehicles, said Phil Koopman, a Carnegie Mellon University engineering professor who helped SAE write its standards documents.
“The DMV concludes that FSD is not an autonomous vehicle because the human driver must watch and be ready to intervene. That is a description that would fit any AV test vehicle with a safety driver,” Koopman said in an email to The Times.
He noted that trained safety drivers are expected to prevent robot cars from making mistakes that could put other road users at risk. “These driving errors include traffic law violations such as running red lights and blowing through stop signs, which a responsible test driver should have prevented but did not,” he said. A recent YouTube video shows a woman testing the FSD beta allowing the car to run a red light.
He added that the DMV would find it difficult to answer Gonzalez’s question about public risk because it lacks data – the kind of data the DMV collects from other robot car developers. The DMV “doesn’t actually address this question, probably because they don’t have test data for FSD. But the reason they don’t have data is because they let Tesla get away with claiming Level 2, thus evading regulatory oversight.”
In its letter to Gonzalez, the agency cited deficiencies in the FSD beta technology to justify its Level 2 designation, including its inability to recognize basic objects such as road debris or to identify emergency vehicles.
Legal distinctions aside, the claim amounts to saying “the technology was too bad to be autonomous, and that’s just not a reassuring statement,” said Bryant Walker Smith, an autonomous vehicle law specialist at the University of South Carolina.
Koopman noted that the DMV’s response letter does not mention a critical element of the SAE standards document, known as J3016 and incorporated by reference into California’s autonomous vehicle regulations: design intent. The SAE document states that it is “incorrect” to designate a Level 4 design feature as Level 2 “just because on-road testing requires a test driver to supervise the feature while it is engaged, intervening if necessary to maintain safe operation.”
Tesla CEO Elon Musk has made his design intent clear. For years, he has been promising that Full Self-Driving technology will produce an autonomous robotaxi that owners can rent out for extra cash when they are not using it. Those promises are nowhere near being fulfilled, and many driverless-car experts believe the YouTube videos show just how far from done Full Self-Driving really is.
The DMV did not provide any details about what its reconsideration would entail or how long it would take to reach a result.
DMV regulations prohibit a company from advertising technology as autonomous when it is not. The agency announced a “review” of Tesla’s potential violation of that regulation last May. Eight months later, the review is still “in progress,” according to the DMV.