
California could make it illegal for Tesla to call its cars ‘fully self-driving’

  • Lawmakers have passed legislation that could force Tesla to stop using the term “fully self-driving” in California.
  • The bill requires the governor’s signature and targets the marketing of driver-assistance systems.
  • The California DMV has accused Tesla of using misleading marketing techniques to promote its FSD software.

Tesla could be forced to stop using the term “fully self-driving” (FSD) in California.

On Tuesday, California lawmakers passed a bill in the Senate that could make it illegal for carmakers to use the software’s current name. The bill, sponsored by Senate Transportation Committee Chair Lena Gonzalez (D-Long Beach), now requires the signature of Gov. Gavin Newsom.

A Newsom spokeswoman declined to comment on the bill. Tesla did not respond to Insider’s request for comment before publication.

While the bill doesn’t target Tesla directly, Gonzalez told the Los Angeles Times that she believes other automakers, such as Ford, GM, and BMW, are more acutely aware of the limitations of their technology.

The bill does not address safety concerns about the software; it targets only its marketing. It would also set new standards for how automakers describe the capabilities of driver-assistance technologies. However, it is unclear how the rule would be enforced by the California Department of Motor Vehicles, the agency that would be responsible for taking action against Tesla.

Lawmakers passed the bill just weeks after the state’s DMV accused Tesla of using misleading marketing to promote its Autopilot and FSD software.

Autopilot is the standard driver-assistance feature included with every Tesla, allowing the car to automatically steer, accelerate, and brake within its lane. FSD, on the other hand, is an optional add-on that lets the car change lanes and stop at traffic lights and stop signs. Tesla tells drivers that both features require a licensed driver who is operating the vehicle and ready to take over at any time.

Some regulators are concerned that the software’s marketing could give drivers a false sense of security. Last year, a man was arrested for riding in the back seat while using FSD on the highway. In June, the National Highway Traffic Safety Administration announced it had expanded its investigation of Autopilot to include the software’s potential role in several fatal crashes.

Gonzalez said she and other legislators took up the bill after the state’s DMV was slow to enact a rule banning vehicles that aren’t truly autonomous from being advertised as “self-driving.” Tesla’s FSD is currently classified as a Level 2 driver-assistance system and is in beta testing. The software has more than 100,000 subscribers, and Tesla tests it in real time, allowing the system’s AI to learn from experienced drivers.

“Are you going to wait for another person to be killed in California?” Gonzalez told the publication. “People in California think fully self-driving cars are fully automated, even though they aren’t,” she added.

Investigations into accidents in the state involving Tesla’s Autopilot are underway, but it’s unclear whether any driver has died in California while using FSD.

Tesla first coined the term “fully self-driving” in 2016, and Elon Musk has repeatedly predicted, going back to 2015, that Tesla would soon have self-driving cars. Most recently, in July, he said the FSD beta would be out of testing by the end of the year. But FSD beta testers have repeatedly pointed out bugs in the software.

A Tesla critic recently launched a viral advertising campaign that appeared to show the software failing to recognize a child-sized mannequin in the street and crashing into it. And earlier this month, Musk scolded a driver on Twitter who had shared a video of the software struggling with turns and lane changes.

Do you work for Tesla or own a Tesla electric car? Contact this reporter via non-work email at [email protected]
