Few car accidents have received more national attention in recent years than the spate of accidents involving self-driving cars. These high-profile incidents range from Google’s first self-driving car crash in 2016 to the first driver killed while a self-driving vehicle’s “Autopilot” system was engaged, in a Tesla Model S that same year.
Most recently, however, attention has turned toward self-driving cars in the ride-sharing industry, largely because of the first accident in which a pedestrian was killed by a self-driving car, in March of 2018. In response to the high-profile coverage of that accident, Uber took its self-driving cars off the road for approximately four months. In late July of 2018, Uber’s self-driving cars returned to the road in Pittsburgh, albeit in “manual mode” with two safety drivers in control of each vehicle.
During the four months that Uber’s self-driving cars were off the road, it was also revealed that 100 self-driving car operators had been terminated after the fatal accident in March.
Why Did Uber Terminate 100 Self-Driving Car Monitors?
In the fatal pedestrian accident in Tempe, the investigation revealed that the Uber safety driver tasked with monitoring the vehicle was watching Hulu when the crash occurred. The safety monitor also allegedly lied to federal investigators, claiming that video showing her looking down captured her monitoring the vehicle’s self-driving interface. Authorities now believe instead that she was repeatedly looking down because she was watching Hulu, a conclusion her Hulu account activity seems to confirm.
On the night of the crash, the driver’s Hulu account streamed the television talent show “The Voice” for 42 minutes, a period that coincided with the time of the accident. It is unclear whether the driver will be charged for her actions that night, but it is clear that Uber wanted to take quick action in response to the blowback.
Uber Took Quick Action, But Should That Action Be Trusted?
In replacing 100 car monitors with 55 technical specialists tasked with improving technical feedback and identifying problems with its self-driving cars, Uber clearly has public relations in mind. But does this move indicate a genuine concern for public safety? There are reasons to be skeptical, especially given the internal correspondence between then-Uber CEO Travis Kalanick and Anthony Levandowski, the former head of Uber’s self-driving vehicle program.
A lawsuit between Uber and Waymo revealed that both Mr. Kalanick and Mr. Levandowski agreed the company needed to take “all the shortcuts” it could in order to win the race for self-driving vehicles within their industry. Based on this stark conversation, it seems Uber was focused on its own business needs, which, in Kalanick’s own words, required “cheat codes.” In effect, this private correspondence suggests that public safety was an afterthought.
If you or a loved one has been injured in an Uber accident, especially one involving a self-driving car, know that you may be entitled to compensation. Our team at Ledger Law will review the facts of your claim to help you maximize the compensation you deserve.
Contact us online today for a legal consultation and free case evaluation with a Ledger Law Uber accident lawyer.