Tesla's Autopilot Blamed For Freeway Accident


A Tesla Model S electric auto crashed into the back of a fire engine in California after its driver apparently let the vehicle's Autopilot system take over.

Tesla's vaunted Autopilot feature may have been the cause of an accident in Culver City, California earlier this week, according to a report from CBS Los Angeles. "Amazingly there were no injuries!" according to a tweet about the crash. A year later, NHTSA issued a report holding the driver at fault, noting that the Autopilot feature is not meant to pilot the vehicle without the driver's full participation.

Tesla essentially passed the buck in a prepared statement addressing the matter, saying the self-driving system is "intended for use only with a fully attentive driver".

Tesla warns that its Autopilot system is not fully autonomous.

Autopilot requires a driver to periodically touch the steering wheel to prove they are paying attention. If the driver never responds, the vehicle gradually slows until it stops and its flashing hazard lights come on.

Despite accidents that draw national news coverage, AAA says its latest survey of drivers shows a growing acceptance of technology that takes over the driving chores. A subsequent investigation by the US National Transportation Safety Board found the driver was speeding and had been warned by the vehicle six times to keep his hands on the wheel.


The NTSB previously investigated Tesla after a Model S crash in 2016 in which the driver died.

In a post on Twitter, the federal safety agency said the "field investigation" would examine both driver and vehicle factors in Monday's accident.

Tesla made design changes to its Autopilot system following the crash.

In another case, officers said a driver's blood alcohol level was more than twice the legal limit. According to police, the driver claimed he was OK to risk his own life, and those of other drivers, by driving drunk because his Tesla was on Autopilot.

The Tesla driver's pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system's limitations.
