Was "Full Self-Driving" Active? The NHTSA and the California Highway Patrol Investigate a Tesla Crash
The California Highway Patrol said in its Dec. 7 report that it could not confirm whether "Full Self-Driving" was active at the time of the crash. On Wednesday, a highway patrol spokesman told CNN that the agency would not determine whether Full Self-Driving was active, and that Tesla would have that information.
The crash occurred around lunchtime on Thanksgiving, snarling traffic on Interstate 80 east of the Bay Bridge: two lanes were closed for about 90 minutes as many people traveled to holiday events. Four ambulances responded to the scene.
The NHTSA told CNN Business that it was gathering additional information from Tesla and law enforcement about the Thanksgiving Day crash.
In a January 31 public filing, Tesla disclosed that it had received requests from the Department of Justice for documents related to its Autopilot and "Full Self-Driving" features.
The best known of these features is Autopilot, which can keep a car within a lane on the road; some drivers say it reduces fatigue during long road trips. Full Self-Driving goes further, allowing the vehicle to navigate city streets, stop for traffic signals, make turns, and reach a destination. Drivers have been impressed with its abilities but alarmed by its flaws.
Autopilot has long been controversial. The National Transportation Safety Board previously found the technology partly to blame in a fatal crash.
Experts say the safety data Tesla cites is not the best measure of how safe the systems actually are, and that there is more to the numbers than meets the eye.
Despite the name and the wide attention it has drawn, "Full Self-Driving" remains a limited program intended for use on city streets. No car available for sale today can drive itself.
Use of the term "self-driving" was dropped in January 2021 because it was being applied in misleading ways.
The Dawn Project's Campaign to End Tesla's "Full Self-Driving" and Its Implications for Human Safety
The commercial, which will air in Washington, DC, Austin, Tallahassee, Albany, Atlanta and Sacramento, does not paint Tesla in the best light. The ad is part of a multimillion-dollar advertising campaign by The Dawn Project. Its founder, Dan O'Dowd, is a California tech CEO who has dedicated millions of his own money, along with a failed US Senate race, to the cause.
The ad shows a Tesla Model 3, allegedly with Full Self-Driving mode turned on, running over a child-sized dummy at a school crosswalk and then a fake baby in a stroller, in a series of tests conducted by The Dawn Project. In the ad, the car also drives past stop signs and into oncoming traffic before reaching the school bus stop.
The Dawn Project, which aims to make computer-controlled systems safer for humans, has been producing test videos to highlight alleged design flaws in Musk's company's software. The video showing a car plowing into the mannequins was published in August. After Tesla fans posted their own videos in the system's defense, YouTube took down several of those tests because they involved actual children and posed safety risks.
The 1,760-word post was written in response to the cease-and-desist letter, as well as the barbs from Musk and Tesla supporters.
O'Dowd is attempting to get Tesla's Full Self-Driving feature banned. He is running national ads and posting online videos displaying the possible dangers of Musk's technology, and he ran an unsuccessful single-issue campaign for the US Senate on the same message.
On December 31, Musk replied to a tweet from @WholeMarsBlog suggesting that "users with more than 10,000 miles on FSD Beta should be given the option to turn off the steering wheel nag."
O'Dowd's alleged conflict of interest: Green Hills Software and an Intel-owned customer
Green Hills Software is a company run by O'Dowd. According to the Washington Post, some of Musk's defenders claim that O'Dowd has a conflict of interest because one of the firm's customers is an Intel-owned company.