Self-Driving Car Fatality
April 11, 2018
After the car crash that killed pedestrian Elaine Herzberg, Arizona’s governor suspended Uber’s ability to test its self-driving vehicles on roads throughout the state. Even though there was a safety driver in the vehicle, recent video shows that the driver may not have had eyes on the road at the time of the accident. This self-driving car fatality raises major questions about whether the technology behind autonomous vehicles is safe enough to be tested on commuter roadways – or whether it poses too great a risk to public safety.
Uber is just one company that’s been testing self-driving vehicles on American roads. In light of Herzberg’s death, many technology firms and public officials are announcing that it may be time to take a brief hiatus and reassess the safety of self-driving vehicles before conducting more public testing and putting people in harm’s way. As we all know, car crashes can result in permanent and disabling injuries. According to the Association for Safe International Road Travel (ASIRT), more than 37,000 Americans die in road crashes annually, so commuter safety must remain a priority.
Self-driving cars are commonly referred to as robot cars, autonomous cars and automated vehicles. One of their major selling points is that they’re supposed to be able to sense the surrounding environment and safely navigate without any human input or action. Not only did Uber’s self-driving vehicle fail to sense Herzberg’s presence (she was crossing a street while walking with her bicycle), but it appears the safety driver failed to notice her as well.
This latest incident raises several questions like the ones we’ve written about on our blog before. Because autonomous vehicles are relatively new products, some liability questions haven’t yet been answered or even fully understood. For example, is the safety driver responsible for not intervening and taking control to avoid the fatal collision? Is Uber? Or are both parties to blame? For the time being, the National Transportation Safety Board (NTSB) is investigating the Uber vehicle involved and has yet to release an official statement.
This isn’t the first fatality that’s been linked to self-driving cars, but it’s the first involving a fully autonomous one. A Tesla driver (whose car was in autopilot mode, in which the operator is supposed to stay alert and keep their hands on the wheel at all times) was fatally injured in an accident in 2016. Unlike that accident, Uber’s car was fully autonomous, meaning the car operated itself and wasn’t supposed to need any kind of driver intervention. According to some sources (including the auto-parts maker that supplied Uber’s vehicle with safety features), the company may have turned off the safety system before the crash.
Autonomous vehicles are equipped with various collision safety technologies, all of which are intended to prevent accidents like this one from happening. With certain parties alleging that Uber turned this feature off, another legal question being asked is whether the company was obligated to enable the technology in all its test vehicles. At this time, it’s not known whether the technology was disabled, as the story is still developing.
According to ethicists around the country, there are many questions to ask about the safety of self-driving vehicles – some of them have to do with this crash in particular, while others are more general. Some examples of these questions are:
- If a human had been driving, would the crash have been avoided?
- How safe do self-driving cars need to be before they’re on the road?
- How should self-driving vehicles prioritize the safety of others (such as pedestrians, vehicle occupants, other automobiles on the road, etc.)?
Even though there are no complete answers to these ethical questions yet, what’s certain is that autonomous vehicles can be extremely dangerous and can put innocent bystanders at risk. Legislators and other government representatives may want to consider creating stricter regulations before continuing to allow fully autonomous vehicles to be tested on American roads, especially if their safety technologies have been disabled or aren’t effective enough.
Although some regulations don’t exist because the rules haven’t yet caught up to the technologies, that’s no excuse for failing to implement ones that keep people safe from catastrophic accidents like these. If self-driving motor vehicles are going to be tested in cities across the country (and eventually become popular on roadways), officials should first ensure that people are adequately protected from preventable injuries and death.
Philadelphia Car Accident Lawyers at Galfand Berger, LLP Helping Represent Victims of Automobile Accidents
If you or a loved one has been injured in a car accident, please contact our Philadelphia car accident lawyers at Galfand Berger. With offices located in Philadelphia, Bethlehem, Lancaster, and Reading, we serve clients throughout Pennsylvania and New Jersey. To schedule a consultation, call us at 800-222-8792 or complete our online contact form.