Self-driving car linked to first death

American safety authorities are investigating what is thought to be the first death caused by a self-driving car.
A Tesla Model S like the one in which Joshua Brown died

The National Highway Traffic Safety Administration said it had launched an inquiry into the death of Joshua Brown, who died when his Tesla Model S collided with an articulated lorry.

The manufacturer confirmed the vehicle was operating in its “autopilot” self-driving mode at the time of the accident.


The system, the same as that demonstrated in the Tesla Model X on Top Gear last week, accelerates, brakes and keeps the vehicle in lane without any driver input. Using only the indicators, the driver can instruct the car to change lanes and overtake.

Tesla said that the white colour of the truck and a bright sky may have contributed to the system failing to “see” the truck before the collision.

The NHTSA said: “Preliminary reports indicate the vehicle crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway.”

In a blog post Tesla said: “What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.


“The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.

“Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”

The company emphasised that the autopilot function was an “assist feature” and did not absolve drivers of ultimate responsibility.

It added: “It is important to note that Tesla disables autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.


“When drivers activate autopilot, the acknowledgment box explains, among other things, that autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to ‘Always keep your hands on the wheel. Be prepared to take over at any time.’”

And the firm insisted that self-driving features were safe and would get even safer as the technology developed. It said that the crash was the first in 130 million miles of self-driving, compared with the global average of one fatality for every 60 million miles.

It added: “As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.”

Nonetheless, the incident could be a setback for car manufacturers who are investing heavily in self-driving features. Alongside the pioneering Tesla, established companies such as BMW, Volvo and Audi are developing their own systems to take over the driving process.

While motorists are increasingly accepting of driver aid functions, fatal crashes, no matter how rare, are likely to leave buyers questioning the computer systems.