July 2, 2022

Tesla brakes for no reason. It’s not the only autopilot problem

Aerial view of Teslas parked at the company’s Fremont facility, California
Photo: Justin Sullivan (Getty Images)

The National Highway Traffic Safety Administration (NHTSA) has released a detailed report on Tesla’s Level 2 driver assistance systems, known as Autopilot and Full Self-Driving. More than 750 Tesla owners have reported that their cars have braked mysteriously on the road for no apparent reason. While this should be a concern for Tesla, it’s far from the only safety issue its semi-autonomous vehicle technology has faced.

Since introducing features with names like “Autopilot” and “Full Self-Driving,” Tesla has faced criticism for exaggerating the capabilities of what are still just driver assistance systems that require constant vigilance from the person behind the wheel. But the naming is only part of the problem; the underlying technology has been riddled with bugs, discovered by ordinary consumers beta testing the software on public roads.

In the latest development in the phantom braking issue, NHTSA has requested more information from Tesla about the 750 complaints. From the news agency’s report:

In the letter, NHTSA asks for the initial speed of the cars at the time they began to brake, the final speed, and the average deceleration. It also asks whether the automated systems detected a target obstacle, and whether Tesla has video of the braking incidents.

The agency is also seeking information on warranty claims for phantom braking, including owners’ names and what repairs were made. It’s also seeking information about Tesla’s sensors, any testing or investigations into the braking problems, and whether any modifications have been made.

The letter focuses on Tesla’s testing of the automated systems when it comes to detecting metal bridges, S-shaped curves, oncoming and crossing traffic, and different sizes of vehicles, including large trucks. The agency also wants information about how the cameras handle reflections, shadows, glare, and blockage due to snow or heavy rain.

In 2017, Autopilot drove a man into a concrete barrier at 70 mph in a fatal accident; it turned out the driver was using his cell phone and may not have noticed that his Tesla had veered sharply. However, the National Transportation Safety Board found that, in this case, Autopilot likely wasn’t even programmed to recognize concrete barriers, and therefore wouldn’t have been programmed to stop for one.

That inability to recognize certain obstacles has resulted in the deaths of two drivers whose vehicles did not know to stop for tractor trailers. Teslas also haven’t known to stop for emergency vehicles parked on the side of the road or in a traffic lane, leading to at least 12 reported incidents. After the Full Self-Driving beta was released, the quality of Tesla’s left turns dropped as users continued to test the software; cars have also aimed for occupied lanes and scraped against bushes. Consumer Reports even compared the FSD Beta to a drunk driver.

Of course, we can’t ignore the human element here. Had the drivers been paying attention, they likely would have recognized the onset of a dangerous situation and been able to perform evasive maneuvers to prevent a crash. After all, drivers are technically supposed to have their hands on the steering wheel and their butts in the seat in order to engage Tesla’s driver assistance programs.

But as Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicles, told CBS News: “It is very easy to get past the steering pressure thing. It has been going on since 2014. We have been discussing this for a long time now.” We at Jalopnik have covered all kinds of ways a driver can apply steering wheel pressure without putting their hands on the wheel. And that pressure sensor was only added after Tesla was called out for omitting it; the company initially avoided installing one to save money.

No matter what NHTSA discovers in this new phantom braking investigation, the fact that Tesla’s semi-autonomous driver assistance systems constantly face so much scrutiny should be a red flag for the automaker itself, consumers, other drivers, and regulators. It should also raise important questions as we continue toward the autonomous vehicle future: How much testing should be required before a semi-autonomous vehicle hits the road? What regulations should be in place to ensure the safety of these technologies? And why are we using everyday drivers as beta testers for software that baffles everyone from engineers to ethicists?