Nearly 363,000 Teslas are being recalled to fix flaws in the behavior of the “Full Self-Driving” software

Tesla’s driver-assist systems are not fully self-driving, argues a lawsuit filed against the automaker after a fatal collision

NHTSA has been investigating Tesla’s driver-assist technology for several years, focusing specifically on more than a dozen incidents in which Tesla vehicles equipped with Autopilot crashed into stationary emergency vehicles. That investigation is far more expansive, covering up to 830,000 vehicles.

On Thanksgiving Day, a multi-car crash snarled freeway traffic, closing two lanes for a short time as people traveled to holiday events. Four ambulances were dispatched to the scene.

The National Highway Traffic Safety Administration says in documents posted Thursday that Tesla will address the problems with an over-the-air software update in the coming weeks.

The US Justice Department has also requested documents from the company relating to its Autopilot and “Full Self-Driving” (FSD) features.

NHTSA’s separate investigation concerns Autopilot, a different system from FSD. That technology combines lane-keeping assist with adaptive cruise control to keep a car in its lane on a highway, as opposed to “full self-driving,” which Tesla says aims to one day operate a vehicle without human supervision on city streets.

Autopilot has long been controversial. The National Transportation Safety Board previously found that the technology was partially to blame in a fatal crash.

Tesla claims that Autopilot is safer than ordinary driving, but autonomous-vehicle experts say the data Tesla cites to support its safety claims compares apples to oranges and isn’t a good measure of the systems’ safety.

Tesla does not appear close to regulatory approval for “full self-driving.” In August 2022, the California DMV said that the name “full self-driving” is “a deceptive practice” and grounds for suspending or revoking Tesla’s license to sell vehicles in the state.

The DMV argued that the branding gave the public a false impression of what the driver-assist systems are actually capable of.

NHTSA’s Concerns and Tesla’s Over-the-Air Software Fix

Documents filed regarding the recall don’t call out specific incidents, but NHTSA’s concerns focus on four situations that can arise on the road: how the cars navigate intersections during a “stale” yellow light; how long they stop at a stop sign when the intersection is clear; how they adjust speed in areas where the posted limit changes, based on road signs the car detects and settings chosen by the driver; and how they change lanes to get out of a turn-only lane.

Using the word “recall” for an over-the-air software update is anachronistic and flat wrong, according to Musk.

The notice says the problems are present in all cars running the current version of the FSD software, which is available on all four Tesla models: the Model S, Model X, Model 3, and Model Y.

The features are key to the company’s basic business plan, both for attracting buyers to its cars in the first place and for the premiums drivers pay for them. Tesla and Musk have repeatedly claimed that FSD, even in its current “beta” form, is safer than cars driven solely by humans. Musk told investors last month that drivers using FSD have logged about 100 million miles outside of highways.

In a November 28 court filing, lawyers for the company argued that the suit should be thrown out because the failure to realize a long-term, aspirational goal is not fraud.

Eleven Tesla drivers interviewed by CNN Business in September said the “full self-driving” feature wasn’t worth its $15,000 price. The feature has been the subject of controversy for years, including a recent ad that played during the Super Bowl in a few markets.

The latest request came after NHTSA sought more information about a Musk tweet indicating that some drivers would be able to turn off an alert that reminds them to keep their hands on the steering wheel.

Update, February 16th at 2:44PM: Added more details about Autopilot, FSD, and Tesla’s over-the-air software updates, and included a tweet from Elon Musk.

According to the agency’s filing, the flagged situations include driving through a yellow light that is about to turn red; not coming to a complete stop at a stop sign; speeding, either because the car fails to detect a road sign or because the driver has set it to default to a faster speed; and making unexpected lane changes to move out of turn-only lanes when going straight through an intersection. Drivers will still be able to use the feature while the fix is developed.

“Humans do not work that way,” says Philip Koopman, who studies self-driving car safety as an associate professor at Carnegie Mellon University. “That’s a fundamental issue with this technology: You have a short reaction time to avoid these situations, and people aren’t good at that if they’re trained to think that the car does the right thing,” he says. When the car gets it wrong, the human driver needs to take over in an instant.

According to the documents, the system may not adequately respond to changes in posted speed limits and may not account for the driver’s own adjustments in speed.

The Recall Covers FSD Software on the Model S, Model X, Model 3, and Model Y

The recall covers all Model S, Model X, Model 3, and Model Y vehicles that have the software installed, as well as some vehicles on which it has not yet been installed.
