A Tesla driving on Autopilot has rear-ended a parked police car in Connecticut, once again raising alarm over the automated technology.
The carmaker’s driver-assistance system, known as Autopilot, was operating a Tesla Model 3 sedan when it struck a parked police cruiser around 12:40 a.m. on December 7. The highway-patrol vehicle was stopped with its hazard lights flashing on a major highway outside Norwalk, Connecticut, while the officer worked to assist a disabled SUV.
The driver of the Tesla said he was checking on his dog in the backseat when his vehicle rear-ended the cruiser. He told police the Tesla Model 3’s Autopilot feature had been activated and that he was not facing forward at the time of the accident.
The Model 3 then continued on and struck the bumper of the disabled Jeep the officer was assisting. Both the rear end of the police cruiser and the front end of the Tesla sustained “heavy” damage.
The driver was issued a misdemeanor summons for reckless driving and reckless endangerment; fortunately, no one was injured.
“When operating a vehicle, your full attention is required at all times to ensure safe driving,” said a Facebook post by Connecticut State Police. “Although a number of vehicles have some automated capabilities, there are no vehicles currently for sale that are fully automated or self-driving.”
Tesla’s Autopilot system allows its vehicles to accelerate, steer, and brake automatically, both within a single lane and when changing lanes. Still, the feature requires “active driver supervision,” according to Tesla’s website.
The early-Saturday incident has heightened existing concerns that Tesla is not doing enough to ensure that Autopilot is used safely. Although the company’s vehicle user manuals urge drivers to stay attentive and keep their hands on the steering wheel at all times, Elon Musk himself has often retweeted video clips of hands-free Autopilot driving.
Some Tesla owners have even posted videos of drivers who appear to be asleep at the wheel while Autopilot is in use.
This isn’t the first time Tesla’s Autopilot has been involved in a major crash. Autopilot was engaged in three earlier fatal crashes, including a deadly 2019 crash in Delray Beach, Florida, that also involved a Model 3. The National Transportation Safety Board is still investigating how much Autopilot actually contributed to that particular crash.
Another crash last year, on a Southern California highway, occurred when a Tesla Model S ran into the back of a fire truck. Although that collision caused no injuries, it triggered a separate National Transportation Safety Board investigation and raised questions about the limitations of driver-assist technology like this, which often fails to detect stationary objects.
This particular problem has persisted for some time without much improvement. We reported back in 2018 that although semi-automated control in vehicles is meant to protect drivers from risky situations on the road, the Insurance Institute for Highway Safety had released a “Reality Check” warning drivers that “cars and trucks with electronic driver assist systems may not see stopped vehicles and could even steer you into a crash if you’re not paying attention.”
That August 2018 warning came after five systems from Tesla, BMW, Volvo, and Mercedes had been tested on both tracks and public roads. In the automatic-braking tests, the institute found that Tesla’s systems in the Model S and the Model 3 fared worst: they were the only models that did not stop in time on the track.
Still, Tesla continues to release quarterly safety reports indicating that drivers using Autopilot are safer than those driving without it. Tesla also says the system reminds drivers that they remain responsible for staying attentive on the road, and that it prohibits further Autopilot use when safety warnings are ignored.
Clearly, this technology still has its limits, and drivers must always keep in mind that any system along the lines of “autopilot” is not self-driving; it is there to assist the driver in controlling the vehicle.