Company confirms its technology failed to stop collision on same day US traffic safety watchdog opens investigation into crash

A Tesla car was driving in “autopilot” mode when it crashed into a stopped firetruck in Utah, the company said in a report to police that largely cast blame on the driver, not its semi-autonomous driving system.

The confirmation that the vehicle’s technology failed to prevent it from colliding with a stopped object in its path came the same day that the US National Highway Traffic Safety Administration (NHTSA) announced it was sending a team to investigate the 11 May crash in Utah.

Tesla officials told police on Wednesday that the driver, who suffered a broken ankle when her Tesla Model S crashed, had turned on the “autosteer” and “cruise control” features about 80 seconds before the crash and taken her hands off the wheel.

In recent weeks, Elon Musk’s electric car company has faced fresh scrutiny over the safety of its autopilot feature, which is supposed to assist drivers in navigating the road but cannot drive the cars on its own. In response to numerous high-profile autopilot crashes, including a fatal collision in California, the CEO has insisted that his technology is safer than traditional automobiles. Yet some experts have warned that the semi-autonomous features give drivers a false sense of security, allowing them to become easily distracted.

The 28-year-old Utah driver was looking at her phone before the collision and was given a traffic citation for “failure to keep proper lookout”, police said on Wednesday.

Tesla’s report to police said there were more than a dozen instances of the driver taking her hands off the wheel for more than one minute at a time, and that she only re-engaged when she was given visual alerts.

“Drivers are repeatedly advised Autopilot features do not make Tesla vehicles ‘autonomous’ and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road,” the company wrote.

The Tesla Model S is seen after it struck the back of a firetruck in South Jordan, Utah. Photograph: Handout/Reuters

The car was traveling at 60 km/h when the crash happened, Tesla said, adding that the driver manually pressed the vehicle’s brake fractions of a second before the collision. The driver’s use of autopilot was “contrary to proper use”, Tesla said, because she “did not pay attention to the road at all times” and used the feature on a street with no center median and with intersections controlled by stoplights.

Asked why the autopilot technology did not prevent the collision, a spokesperson told the Guardian in an email: “Tesla told police that the crash occurred because the driver did not pay attention to the road at all times, and that Autopilot is not intended for use without a fully attentive driver.”

The spokesperson also pointed to Tesla’s manual, which warns drivers that the cruise control feature cannot detect all objects and may not brake or slow down for stationary vehicles, and that its “automatic emergency braking” is not a substitute for drivers maintaining a safe distance from the cars in front of them.

Tesla also declined to comment on the NHTSA investigation. In recent weeks, the company has publicly feuded with US investigators over ongoing federal inquiries related to autopilot.

Musk also criticized journalists for writing about the crash in a series of tweets earlier this week.

The first fatal autopilot crash occurred in 2016, when Tesla’s software failed to “see” the white side of a tractor-trailer in its path against the background of a bright sky.
