In recent years, driverless cars have become a growing phenomenon. From Tesla’s semi-autonomous Autopilot to Uber’s self-driving vehicles, the introduction of cars that can control themselves without a human at the wheel has significantly changed what it means to be a driver on the road. Driverless cars represent an incredible step forward in innovation, but, as often happens with change, new problems have been brought to light as well.
A Chicago Tribune article recently described a Tesla in Autopilot mode that sped up before crashing into a stopped firetruck. The accident injured two people: the driver of the Tesla and the driver of the firetruck. Police suggested that the Tesla had most likely been following another vehicle, dropping its speed to 55 mph to match it. When that vehicle changed lanes, the Tesla automatically accelerated back to its preset speed of 60 mph without registering the cars stopped ahead, and crashed into the firetruck.
According to the police report, the driver of the Tesla informed the police that she had been driving under the impression that the Tesla’s automatic emergency braking system would be able to detect the traffic and stop before it hit another vehicle. The driver had owned the car for two years and used the semi-autonomous Autopilot feature on all types of roadways, including the Utah highway she had crashed on. The police said that the car data showed that the driver of the Tesla had not touched the steering wheel for 80 seconds before the crash because she had been looking for her phone and comparing different routes to her destination. When asked about the accident, Tesla pointed out that “drivers are continually reminded to keep their hands on the wheel and maintain control of their vehicle at all times while using the Autopilot system.”
In another accident involving Uber’s self-driving vehicles, a woman crossing a darkened street in Arizona was struck and killed by one of the robotic cars. This was the first death involving a fully autonomous vehicle, raising concerns about the safety of computer-controlled cars being built by Uber and several other companies. Although the vehicle is considered fully autonomous, Uber employs safety drivers to prevent accidents such as the one that occurred in Arizona. A video released by Tempe police shows the safety driver repeatedly looking down before the woman was hit; neither the car’s system nor the driver managed to stop the Uber in time. The safety driver is supposed to take control of the car whenever its sensors and algorithms are unable to interpret the circumstances.
These accidents have fueled a growing debate over where to draw the line on how much control to give driverless cars. In the two incidents described above, the drivers clearly bore a good deal of the fault, but the incidents also highlight the worry that comes with putting more and more driverless cars on the road. Even a slight flaw in these programs, or an unforeseen circumstance, could have a severe, potentially life-threatening impact on a person. Drivers, even those sitting behind the wheel of a driverless car, should always err on the side of caution. Assuming that the car can control itself and adjust to any situation without a driver’s aid is a risk not worth taking.