We recently reported on two influential groups changing the way vehicles are rated to account for the safety of partially automated driver-assist systems. Starting in 2022, those updated scores will apply to all new vehicle models that use such technology. These changes come as prosecutors in California file two counts of vehicular manslaughter against a driver who ran a red light and killed two people while operating a Tesla on Autopilot in 2019.
The charges, initially filed in October, appear to be the first felony charges in the United States against a driver using a partially automated driving system. The 27-year-old driver, Kevin George Aziz Riad, pleaded not guilty.
Tesla’s Autopilot system helps control speed, braking, and steering, and its misuse has been the underlying cause of various crashes and investigations. While other criminal cases have involved automated driving systems, this is the first charge involving a widely used driver-assist system.
There are currently 765,000 Tesla vehicles in the U.S. equipped with Autopilot, a stark contrast to the technology involved in the 2020 negligent homicide charge against an Uber backup driver who had been helping to test fully autonomous vehicles on public roads.
The recent Tesla crash involves far more widely used technology, which is a major concern for transportation safety advocates. In the incident, a Tesla Model S was traveling at high speed when it exited a freeway in the Los Angeles area, ran a red light, and struck a Honda Civic at an intersection in Gardena. The driver and passenger in the Civic died at the scene, while Riad and his passenger were hospitalized with injuries.
The National Highway Traffic Safety Administration sent investigators to the crash and, in January 2022, confirmed that Autopilot was in use at the time of the incident. Riad’s preliminary hearing is scheduled for tomorrow.
Drivers’ overconfidence in driver-assist systems has been at the center of many crashes, including deadly ones. Because of this, the National Transportation Safety Board and the National Highway Traffic Safety Administration have been reviewing Autopilot’s misuse and describe the underlying problem as “automation complacency.”
“I think we are a lot further away from self-driving cars than tech companies and commercials would like us to believe,” said Levinson and Stefani’s Jay Stefani. “That said, there is a lot of great safety technology out there to assist drivers. Lane drift alarms, blind spot detectors, active forward collision avoidance systems, and back-up cameras and alarms are all examples of technology that has saved lives.”
According to Consumer Reports and the Insurance Institute for Highway Safety, studies show that drivers often rely too heavily on automated systems and pay little attention to alerts. Advertising for vehicles equipped with these systems also typically exaggerates their abilities.
Tesla’s Autopilot system launched in 2015, and in 2018 a Tesla driver died in a collision with a freeway barrier in Mountain View, California. According to the NTSB, the driver had let the system operate while playing a mobile game.
“Keeping drivers focused on the road and the vehicle is critical for the safe use of partially automated driving systems,” said the president of IIHS, David Harkey.
Drivers often equate partially automated systems with self-driving vehicles, although no self-driving vehicles are available to consumers. When a driver ignores a vehicle’s monitoring systems, the result may be more dangerous than driving a vehicle without any partial automation at all, Harkey noted.
“There are studies that go back probably 80 years that show humans are pretty bad about just watching automation happen,” said Jake Fisher, senior director of Consumer Reports’ Auto Test Center. “It’s just too easy to get bored and let your attention wander.”
Tesla has responded to these issues by updating its automated driving software to better deter misuse and to improve Autopilot’s ability to detect emergency vehicles.