Federal Safety Agency Expands Investigation of Tesla’s Autopilot System

The federal government’s top auto safety agency is expanding its investigation of Tesla and its Autopilot driver-assistance system to determine whether the technology poses a risk to safety.

The agency, the National Highway Traffic Safety Administration, said on Thursday that it had upgraded its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny.

The analysis will examine whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.

“We have been calling for closer scrutiny of Autopilot for a while,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates efforts across the country to promote safe driving.

NHTSA said it was aware of 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. But the agency said on Thursday that it had not determined whether Autopilot has a defect that can cause cars to crash while it is engaged.

The broadened investigation covers 830,000 cars sold in the United States. They include all four Tesla models, the S, X, 3 and Y, from model years 2014 through 2021. The agency will look at Autopilot and its various component systems that handle steering, braking and other driving tasks, as well as a more advanced system that Tesla calls Full Self-Driving.

Tesla did not respond to a request for comment on the agency’s move.

The preliminary evaluation focused on 11 crashes in which Tesla cars operating under Autopilot control struck parked emergency vehicles that had their lights flashing. NHTSA said on Thursday that during that review it had identified 191 crashes that occurred while cars were operating with Autopilot, Full Self-Driving or associated features.

Tesla says Full Self-Driving software can guide a car on city streets, but it does not make the car fully autonomous, and drivers must remain attentive. It is also available only to a limited group of customers in what Tesla calls a “beta,” or test, version that is not fully developed.

The deepening of the investigation signals that NHTSA is taking more seriously safety concerns about the lack of safeguards to prevent drivers from using Autopilot in a dangerous way.

“This is not your typical defect case,” said Michael Brooks, executive director of the Center for Auto Safety, a nonprofit consumer advocacy group. The problem, he said, may not be a single component in the vehicle.

Tesla and its chief executive, Elon Musk, have been criticized for overstating the abilities of Autopilot and Full Self-Driving, names that suggest the systems can pilot cars without intervention from drivers.

“At a minimum, the names should be changed,” said Adkins of the Governors Highway Safety Association. “Those names confuse people into thinking the systems can do more than they actually can.”

Competing systems developed by General Motors and Ford Motor use infrared cameras that closely track drivers’ eyes and sound warning chimes if a driver looks away from the road for more than two or three seconds. Tesla did not initially include such a driver-monitoring system in its cars, and later added only a standard camera, which is much less precise than infrared cameras at tracking eyes.

Tesla instructs drivers to use Autopilot only on divided highways, but the system can be activated on any road that has lines down the middle. The GM and Ford systems, known as Super Cruise and BlueCruise, can be activated only on highways.

Autopilot was first offered in Tesla cars in late 2015. It uses cameras and other sensors to steer, accelerate and brake with little input from the driver. While the owner’s manual tells drivers to keep their hands on the steering wheel and their eyes on the road, early versions of the system allowed drivers to keep their hands off the wheel for five minutes or more under certain conditions.

Unlike technologists at almost every other company working on self-driving cars, Musk has insisted that autonomy can be achieved solely with cameras tracking a car’s surroundings. But many Tesla engineers have questioned whether relying on cameras without other sensing devices is safe enough.

Musk has regularly promoted Autopilot’s abilities, calling autonomous driving a “solved problem” and predicting that drivers will soon be able to sleep while their cars drive them.

Questions about the system arose in 2016, when a man from Ohio was killed after his Model S crashed into a tractor-trailer on a highway in Florida while Autopilot was activated. NHTSA investigated that crash and said in 2017 that it had found no safety defect in Autopilot.

Problems With Tesla’s Autopilot System

Claims of safer driving. Tesla cars can use computers to handle some aspects of driving, such as changing lanes. But there are concerns that this driver-assistance system, called Autopilot, is not safe. Here is a closer look at the issue.

Driver assistance and crashes. A 2019 crash that killed a college student shows how Autopilot and driver distraction can combine with tragic consequences. In another crash, a Tesla vehicle hit a truck, leading to the death of a 15-year-old boy from California. His family sued the company, claiming the Autopilot system was partly responsible.

A federal investigation. Last year, the National Highway Traffic Safety Administration began investigating Autopilot’s role in crashes after at least 11 incidents in which Teslas collided with parked emergency vehicles. In June 2022, the agency, which has the authority to order recalls and require new safety features, announced a significant expansion of the investigation.

Shortcuts on safety? Former Tesla employees have said the automaker may have compromised safety in designing its Autopilot driver-assistance system to fit the vision of its chief executive, Elon Musk, who they say insisted that the system rely solely on cameras tracking the vehicle’s surroundings rather than also using additional sensors. Autonomous vehicle systems from most other companies do use such additional sensing devices.

An information gap. A lack of reliable data also hampers assessments of the system’s safety. Reports that Tesla publishes every three months suggest that crashes are less frequent with Autopilot than without it, but the figures are misleading: they do not account for the fact that Autopilot is used mainly for highway driving, which is generally much safer than driving on city streets.

However, the agency issued a bulletin in 2016 stating that driver-assistance systems that fail to keep drivers engaged “can also pose unreasonable safety risks.” And in a separate investigation, the National Transportation Safety Board concluded that the Autopilot system had “played a key role” in the Florida crash because, while it operated as intended, it lacked safeguards to prevent misuse.

Tesla faces lawsuits from the families of victims of fatal crashes, and some customers have sued the company over its claims about Autopilot and Full Self-Driving.

Last year, Musk acknowledged that developing self-driving cars was harder than he had thought.

NHTSA opened its preliminary evaluation of Autopilot in August, initially focusing on 11 crashes in which Teslas operating with Autopilot engaged ran into police cars, fire trucks and other emergency vehicles that had stopped and had their lights flashing. One person was killed and 17 were injured in those crashes.

While that examination was underway, NHTSA learned of six more crashes involving stopped emergency vehicles and excluded one of the original 11 from further study.

At the same time, the agency learned of dozens of other crashes that occurred while Autopilot was active but that did not involve emergency vehicles. Of those, it first focused on 191, then eliminated 85 from further scrutiny because it could not obtain enough information to get a clear picture of whether Autopilot was a major cause.

In about half of the remaining 106 crashes, NHTSA found evidence suggesting that drivers did not have their full attention on the road. About a quarter of the 106 occurred on roads where Autopilot is not supposed to be used.

In an engineering analysis, NHTSA’s Office of Defects Investigation sometimes acquires the vehicles it is examining and arranges testing to try to identify flaws and replicate the problems they can cause. In the past, it has taken apart components to find faults and has asked manufacturers for detailed data on how components operate, often including proprietary information.

The process can take months or even a year or more. NHTSA aims to complete the analysis within a year. If it concludes that a safety defect exists, it can press the manufacturer to initiate a recall and correct the problem.

On rare occasions, automakers have contested the agency’s conclusions in court and prevailed in halting recalls.