Fatal Crash Renews Concerns About Tesla's Autopilot

The National Highway Traffic Safety Administration, the federal auto safety regulator, and the National Transportation Safety Board, an independent body that investigates transportation accidents, have both sent teams to Texas to investigate the crash. "We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate steps when we have more information," NHTSA said in a statement. It may take weeks, or even months, for the investigations' findings to be released.

Still, the incident highlights the gap between Tesla's marketing of its technology and the system's actual capabilities, as reflected in the company's own manuals and regulatory filings.

A small cottage industry of videos has sprung up on platforms like YouTube and TikTok in which people try to "trick" Autopilot into driving without an attentive supervisor in the driver's seat; some videos appear to show people asleep behind the wheel. Tesla owners have also demonstrated that, once the driver's seat belt is buckled, the car can be made to drive in Autopilot mode for at least a few seconds with no one behind the wheel.

Tesla, and Musk in particular, has a mixed history of public claims about Full Self-Driving and Autopilot. Autopilot gives drivers visual and audible warnings if its sensors do not detect pressure on the steering wheel for roughly 30 seconds, and the car will come to a stop if the driver fails to respond for several minutes. But during a 60 Minutes appearance in 2018, Musk sat behind the wheel of a moving Model 3, leaned back, and put his hands in his lap. "Now you're not driving at all," the anchor said in surprise.

This month, Musk told podcaster Joe Rogan, "I think Autopilot is good enough that you won't have to drive most of the time unless you really want to." The CEO has also repeatedly overstated his company's progress toward self-driving. In 2019 he promised that Tesla would have 1 million robotaxis on the road by the end of 2020. But in the fall of 2020, company representatives told the California Department of Motor Vehicles that Full Self-Driving was not expected to change significantly in the near future, and that FSD would remain a driver-assistance feature rather than an autonomous one.

To date, FSD has been released to only about 1,000 participants in a beta testing program. Last month, Musk urged those testing FSD Beta to remain careful, even as the software matures.

At least three people have died in crashes involving Autopilot. After investigating a fatal 2018 crash in Mountain View, California, the NTSB called on federal regulators and Tesla to ensure that drivers can engage Autopilot only in the conditions it is designed to handle. It also urged Tesla to build a more robust driver-monitoring system to ensure that drivers keep their attention on the road. General Motors, for example, allows its Super Cruise driver-assistance feature to operate only on pre-mapped highways, and a driver-facing camera detects when the driver's eyes leave the road.

An NHTSA spokesman says the agency has opened investigations into 28 Tesla crashes.

Tesla says its data show that its cars are safer than the average US car. On Saturday, just hours before the fatal Texas crash, Musk wrote that Teslas with Autopilot engaged are far less likely to be involved in an accident than the average car, as measured by government data. But experts say the comparison is misleading. Autopilot is meant to be used only on highways, while the government data covers all types of driving. And Teslas are heavy, high-end vehicles, which tend to fare better in crashes.

