Tesla in fatal California crash was on Autopilot: Officials
A Tesla involved in a fatal crash on a Southern California freeway in the United States last week was operating on Autopilot at the time, authorities said.
The May 5 crash in Fontana, a city 80km (50 miles) east of Los Angeles, is being investigated by the National Highway Traffic Safety Administration (NHTSA). It is the 29th crash involving a Tesla to which the agency has responded.
A 35-year-old man was killed when his Tesla Model 3 struck an overturned semi on the freeway at about 2:30am local time (09:30 GMT). The driver’s name has not yet been made public. Another man was seriously injured when the electric vehicle hit him as he was helping the semi’s driver out of the wreck.
The California Highway Patrol, or CHP, announced on Thursday that the car had been operating on Tesla’s Autopilot, a partially automated driving system that has been involved in a number of crashes. The Fontana crash is at least the fourth US death involving Autopilot.
“While the CHP does not normally comment on ongoing investigations, the Department recognizes the high level of interest in crashes involving Tesla vehicles,” the agency said in a statement. “We felt this provided an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”
The federal safety investigation came after the CHP this week arrested a man who authorities said was riding in the back seat of a Tesla travelling on Interstate 80 near Oakland with no one in the driver’s seat.
The CHP has not said whether officials have determined if the Tesla in the I-80 case was operating on Autopilot, which can keep a car centred in its highway lane and a safe distance behind vehicles in front of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be riding in the back seat. Tesla allows a limited number of owners to test its self-driving software.
Tesla, which has disbanded its public relations department, did not respond on Friday to a request for comment. The company says in owner’s manuals and on its website that Autopilot and “Full Self-Driving” are not fully autonomous and that drivers must pay attention and be ready to intervene at any time.
Autopilot has at times had difficulty dealing with stationary objects and vehicles crossing in front of Teslas.
In two Florida crashes, in 2016 and 2019, Teslas operating on Autopilot drove beneath crossing tractor-trailers, killing the men driving the cars. In a 2018 crash in Mountain View, California, an Apple engineer using Autopilot was killed when his Tesla struck a highway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several fire engines and police vehicles that were parked on highways with their emergency lights flashing.
In March, for example, the NHTSA sent a team to investigate after a Tesla on Autopilot ran into a Michigan State Police vehicle on Interstate 96 near Lansing. Neither the 22-year-old Tesla driver nor the trooper was injured, police said.
In the aftermath of the fatal crashes in Florida and California, the National Transportation Safety Board (NTSB) recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit Autopilot’s use to highways where it can work effectively. Neither Tesla nor the safety agency acted on the recommendations.
In a February 1 letter to the US Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations covering driver-assist systems such as Autopilot, as well as the testing of autonomous vehicles. The NHTSA has relied mainly on voluntary guidelines for the vehicles, taking a hands-off approach so as not to hinder the development of new technology.
Sumwalt said Tesla is using people who have bought its cars to test “Full Self-Driving” software on public roads with limited oversight or reporting requirements.
“Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV [autonomous vehicle] system’s limitations,” Sumwalt wrote.
He added: “While Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users.”
The NHTSA, which has the authority to regulate automated driving systems and seek recalls when necessary, appears to have developed a renewed interest in the systems since US President Joe Biden took office.