Recently, the police department in Tempe, Arizona released a disturbing video clip of a collision. A self-driving Uber operating in “autonomous mode” struck and fatally injured a pedestrian who was walking across multiple lanes of traffic. The test driver, who was supposed to serve as a backup, also failed to react to the situation.
The vehicle was part of a test, and the test driver was responsible for ensuring that it operated safely. Autonomous vehicle technology has not yet been perfected, and it still suffers from serious, dangerous limitations. If an autonomous car’s sensors fail to recognize a pedestrian, bicyclist, or another vehicle, or if the software crashes, the test driver is supposed to take control. The driver in the Tempe police’s video appears to be looking out the side window, or looking down, at the time of the crash.
In the video clip, the woman was walking her bicycle across the road when she was struck by the Volvo XC90, which Uber had retrofitted with autonomous driving technology. The vehicle was also fitted with cameras on the inside and outside to help Uber understand what happened if something went wrong, so it could make the product better and safer. The video of the test driver is a reminder that people can become lulled into a false sense of security when riding in autonomous vehicles, and they may not be prepared to intervene when necessary.
The chief of the Tempe police said that the accident would have been difficult for any driver to avoid. It happened at night, when the visibility was poor, and the pedestrian was not using a crosswalk.
Regardless, the SUV’s radar sensors and LIDAR should have easily detected the pedestrian and prevented the crash. Many experts in the field want to know what went wrong, and why the vehicle did not “see” the pedestrian and react. The cause of the failure is currently unknown. The pedestrian did not suddenly jump out into the street; she was progressing across multiple lanes of traffic, and the autonomous system should have easily detected her.
According to a research director at Gartner’s CIO Research Group, even a standard automatic emergency braking system could have detected this pedestrian and hit the brakes.
Some suspect that this accident may be what is known as an “edge case” — a situation that autonomous vehicles have not yet been developed to handle.
The investigation is ongoing. Many remain optimistic that autonomous vehicles will someday be safer than regular cars, because they eliminate the chief causes of accidents: human error and negligence.
If you have been involved in a car accident, it is likely that human negligence played a role. The experienced New Jersey car accident lawyers at Eichen Crutchlow Zaslow, LLP can help you recover the compensation you deserve and get your life back. Call 732-777-0100 or contact us online to schedule a free consultation today. Our offices are conveniently located in Red Bank, Toms River, and Edison, New Jersey and serve clients throughout New Jersey.