
Questions about the safety of Tesla’s ‘Full Self-Driving’ system are growing

A series of alarming recent incidents have drawn the attention of federal regulators, who were already investigating Tesla’s automated driving systems because of dozens of crashes that raised safety concerns.

CFOTO/Future Publishing via Getty Images

Photo illustration: Tesla’s “Full Self-Driving” (FSD) software, which Musk is seeking to launch in China; Suqian, Jiangsu province, China, April 28, 2024.

Three times in the past four months, William Stein, a technology analyst at Truist Securities, has taken Elon Musk up on his invitation to try the latest versions of Tesla’s vaunted “Full Self-Driving” system.

A Tesla equipped with the technology, the company says, can travel from point to point with little human intervention. Yet each time Stein drove one of the cars, he said, the vehicle made unsafe or illegal maneuvers. His most recent test-drive earlier this month, Stein said, left his 16-year-old son, who accompanied him, “terrified.”


Stein’s experiences, along with a Seattle-area Tesla crash involving Full Self-Driving that killed a motorcyclist in April, have drawn the attention of federal regulators. They have already been investigating Tesla’s automated driving systems for more than two years because of dozens of crashes that raised safety concerns.

The problems have led people who monitor autonomous vehicles to become more skeptical that Tesla’s automated system will ever be able to operate safely on a widespread scale. Stein says he doubts Tesla is even close to deploying a fleet of autonomous robotaxis by next year as Musk has predicted it will.


The latest incidents come at a pivotal time for Tesla. Musk has told investors it’s possible that Full Self-Driving will be able to operate more safely than human drivers by the end of this year, if not next year.

And in less than two months, the company is scheduled to unveil a vehicle built expressly to be a robotaxi. For Tesla to put robotaxis on the road, Musk has said the company will show regulators that the system can drive more safely than humans. Under federal rules, the Teslas would have to meet national standards for vehicle safety.

Musk has released data showing miles driven per crash, but only for Tesla's less-sophisticated Autopilot system. Safety experts say the data is invalid because it counts only serious crashes with air bag deployment and doesn't show how often human drivers had to take over to avoid a collision.

Full Self-Driving is being used on public roads by roughly 500,000 Tesla owners — slightly more than one in five Teslas in use today. Most of them paid $8,000 or more for the optional system.

The company has cautioned that cars equipped with the system cannot actually drive themselves and that motorists must be ready at all times to intervene if necessary. Tesla also says it tracks each driver’s behavior and will suspend their ability to use Full Self-Driving if they don't properly monitor the system. Recently, the company began calling the system “Full Self-Driving (Supervised).”

Musk, who has acknowledged that his past predictions for autonomous driving proved too optimistic, promised in 2019 a fleet of autonomous vehicles by the end of 2020. Five years later, many who follow the technology say they doubt it can work across the U.S. as promised.

“It’s not even close, and it’s not going to be next year,” said Michael Brooks, executive director of the Center for Auto Safety.

The car that Stein drove was a Tesla Model 3, which he picked up at a Tesla showroom in Westchester County, north of New York City. The car, Tesla's lowest-price vehicle, was equipped with the latest Full Self-Driving software. Musk says the software now uses artificial intelligence to help control steering and pedals.

During his ride, Stein said, the Tesla felt smooth and more human-like than past versions did. But in a trip of less than 10 miles, he said the car made a left turn from a through lane while running a red light.

“That was stunning,” Stein said.

He said he didn't take control of the car because there was little traffic and, at the time, the maneuver didn't seem dangerous. Later, though, the car drove down the middle of a parkway, straddling two lanes that carry traffic in the same direction. This time, Stein said, he intervened.

The latest version of Full Self-Driving, Stein wrote to investors, does not “solve autonomy” as Musk has predicted. Nor does it “appear to approach robotaxi capabilities.” During two earlier test drives he took, in April and July, Stein said Tesla vehicles also surprised him with unsafe moves.

Tesla has not responded to messages seeking comment.

Stein said that while he thinks Tesla will eventually make money off its driving technology, he doesn't foresee a robotaxi with no driver and a passenger in the back seat in the near future. He predicted it will be significantly delayed or limited in where it can travel.

There's often a significant gap, Stein pointed out, between what Musk says and what is likely to happen.

To be sure, many Tesla fans have posted videos on social media showing their cars driving themselves without humans taking control. Videos, of course, don't show how the system performs over time. Others have posted videos showing dangerous behavior.

Alain Kornhauser, who heads autonomous vehicle studies at Princeton University, said he drove a Tesla borrowed from a friend for two weeks and found that it consistently spotted pedestrians and detected other drivers.

Yet while it performed well most of the time, Kornhauser said, he had to take control when the Tesla made moves that scared him. He warns that Full Self-Driving isn't ready to operate without human supervision in all locations.

“This thing,” he said, “is not at a point where it can go anywhere.”

Kornhauser said he does think the system could work autonomously in smaller areas of a city where detailed maps help guide the vehicles. He wonders why Musk doesn't start by offering rides on a smaller scale.

“People could really use the mobility that this could provide,” he said.

For years, experts have warned that Tesla's system of cameras and computers isn't always able to spot objects and determine what they are. Cameras can't always see in bad weather and darkness. Most other autonomous robotaxi companies, such as Alphabet Inc.'s Waymo and General Motors' Cruise, combine cameras with radar and laser sensors.

“If you can't see the world correctly, you can't plan and move and actuate to the world correctly,” said Missy Cummings, a professor of engineering and computing at George Mason University. “Cars can't do it with vision only,” she said.

Even those with laser and radar, Cummings said, can't always drive reliably yet, raising safety questions about Waymo and Cruise. (Representatives for Waymo and Cruise declined to comment.)

Phil Koopman, a professor at Carnegie Mellon University who studies autonomous vehicle safety, said it will be many years before autonomous vehicles that operate solely on artificial intelligence will be able to handle all real-world situations.

“Machine learning has no common sense and learns narrowly from a huge number of examples,” Koopman said. “If the computer driver gets into a situation it has not been taught about, it is prone to crashing.”

Last April in Snohomish County, Washington, near Seattle, a Tesla using Full Self-Driving hit and killed a motorcyclist, authorities said. The Tesla driver, who has not yet been charged, told authorities that he was using Full Self-Driving while looking at his phone when the car rear-ended the motorcyclist. The motorcyclist was pronounced dead at the scene, authorities reported.

The National Highway Traffic Safety Administration (NHTSA) said it's evaluating information on the fatal crash from Tesla and law enforcement officials. The agency also says it's aware of Stein’s experience with Full Self-Driving.

NHTSA also noted that it's investigating whether a Tesla recall earlier this year, which was intended to bolster its automated vehicle driver monitoring system, actually succeeded. It also pushed Tesla to recall Full Self-Driving in 2023 because, in “certain rare circumstances,” the agency said, it can disobey some traffic laws, raising the risk of a crash. (The agency declined to say if it has finished evaluating whether the recall accomplished its mission.)

As Tesla electric vehicle sales have faltered for the past several months despite price cuts, Musk has told investors that they should view the company more as a robotics and artificial intelligence business than a car company. Yet Tesla has been working on Full Self-Driving since at least 2015.

“I recommend anyone who doesn’t believe that Tesla will solve vehicle autonomy should not hold Tesla stock,” he said during an earnings conference call last month.

Stein told investors, though, they should determine for themselves whether Full Self-Driving, Tesla's artificial intelligence project “with the most history, that's generating current revenue, and is being used in the real world already, actually works.”

Copyright The Associated Press