After the National Transportation Safety Board blamed Uber Technologies Inc.’s policies for a fatal 2018 crash involving one of its self-driving vehicles, the company has pledged to make more safety information about its autonomous-driving program public.
Uber included the new pledge in an updated voluntary safety assessment sent to the National Highway Traffic Safety Administration on August 28th. It is the company’s first public response to criticism of its autonomous-driving program since the NTSB released its findings on the first fatal pedestrian crash involving a self-driving vehicle, one of Uber’s, in Tempe, Arizona, in 2018.
“We support the idea of transparency and making the public understand what we do,” said Nat Beuse, head of safety for Uber’s Advanced Technologies Group. The new voluntary safety assessment, he said, is a “complete update” of what Uber originally told regulators in 2018.
In the March 18th, 2018 incident, an Uber test vehicle operating in self-driving mode struck and killed a 49-year-old pedestrian who was crossing a dimly lit roadway; the safety operator behind the wheel was not paying attention to the road.
The crash triggered nationwide backlash against self-driving cars, even though, as police noted, the vehicle was operating only for testing purposes.
The NTSB found that the safety operator had been distracted by a cellphone and failed to monitor the road, and that Uber shared blame for subpar safety risk assessment procedures, inadequate oversight of vehicle operators, and the absence of any mechanism to address operator complacency.
Cellphone distraction and operator disengagement more broadly are considered major issues for AV technology, and a growing body of research examines how much advanced driving systems erode the attentiveness of the humans supervising them. Still, many AV supporters believe such problems will recede as self-driving technology improves, eventually, they hope, to the point where car crashes are eliminated.
In its assessment, Uber highlights recent “enhancements,” including a new “Safety Case Framework” it says will allow for open-sourced peer review. The company also said it is putting in place new internal safety management policies and an independent Safety and Responsibility Advisory Board.
Still, safety advocates have criticized the Trump Administration’s voluntary approach to self-driving vehicle regulation, saying the voluntary reports read more like marketing brochures than formal regulatory filings.
The fatal crash also intensified debate over legislation in Washington that would raise the number of autonomous vehicles manufacturers could produce. Uber suspended all self-driving testing for four months and shut down its Arizona driverless testing program, laying off 300 employees.
To date, 23 companies have published their own self-driving safety assessments, including Apple Inc., Ford Motor Co., General Motors Co., Lyft Inc., Mercedes-Benz AG, Toyota Motor Corp., and Waymo. Uber is one of the few that have begun updating those voluntary assessments, Beuse noted.
Consumer safety advocates have seized on the case to push for stricter self-driving car regulations and more frequent, consumer-focused safety assessment reports. Many have also criticized government agencies for being too lenient on the firms developing vehicle autonomy and for relying on voluntary reporting at all.
“It’s nice that Uber has decided this is the right time to update its so-called report, but a consumer-focused agency would have long ago mandated all driverless vehicle manufacturers regularly submit useful safety details regarding their public road tests,” said Jason Levine, executive director of the Center for Auto Safety.
The major problem with autonomous vehicle testing is weak federal oversight, said Ensar Becic, an NTSB project manager. Even basic, regulated tests of self-driving vehicles, “be it an obstacle course, a perceptual test, or tangible requirements such as testing for miles or adherence to development standards,” remain too rare to support safe, widespread deployment, he explained.
The National Highway Traffic Safety Administration’s automated vehicle guidance offers just 12 suggested safety elements for testing, and although NHTSA encourages AV companies to submit self-assessments covering those 12 elements, few do. The guidance also provides no metrics by which developers of autonomous driving systems can judge whether they have actually met the safety goals in those 12 areas.
NTSB members credited Uber for cooperating with the investigation, whose findings were released late last year, but concluded that the company’s overall “ineffective safety culture” contributed to the fatal crash.