Autonomous Cars — After Crash, Are They Actually Safe?

There is perhaps no greater symbol of luxury than riding around in a car driven by a chauffeur. But what if, instead of a human driver, the car itself was your chauffeur? That is what tech companies like Uber and Google, along with traditional auto manufacturers, are trying to develop with autonomous cars. However, a recent fatal accident might halt that effort altogether.

This week the U.S. National Transportation Safety Board (NTSB) released a preliminary report on the March 2018 crash of an autonomous car in Arizona. The car struck and killed a pedestrian, and since then all anyone has wanted to know is: what went wrong? The crash also leads to a larger discussion about whether or not self-driving cars are safer than those driven by humans.

Credit: Automobile Italia, Flickr

What Happened in Arizona?

On the night of March 18, 2018, an autonomous Uber car struck and killed Elaine Herzberg, who was dressed in dark clothing at the time. The car hit her as she crossed the middle of a road with her bicycle.

The NTSB report explains in more detail what happened, depicting the autonomous car's decision-making as muddled with confusion. The car uses both radar and laser-based sensors to detect its surroundings, and those systems functioned. However, the software misinterpreted the data they produced, and critical safety systems had been disabled.

About six seconds before the collision, the car's detection systems spotted the pedestrian, but they were unable to classify her as a human being. The car's computer classified her first as an unknown object, then as a vehicle, and finally as a bicycle. With each new classification, the computer predicted a different travel path. All of this took place in a matter of about three seconds.
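To illustrate why that churn matters, here is a minimal, hypothetical Python sketch (not Uber's actual software) of how unstable classification keeps resetting an object's predicted path: every time the label changes, the motion model changes with it, so the car never settles on one consistent prediction.

    # Hypothetical sketch, not Uber's actual code: each new label picks a
    # different motion model, so the predicted path keeps changing too.
    def predict_path(label):
        """Choose a motion model based on the object's current label."""
        models = {
            "unknown object": "assume it stays put",
            "vehicle": "assume it follows the lane at road speed",
            "bicycle": "assume it travels along the edge of the road",
        }
        return models.get(label, "assume it stays put")

    previous = None
    # The labels the NTSB report says the system cycled through.
    for label in ["unknown object", "vehicle", "bicycle"]:
        if label != previous:
            # A changed label throws out the old prediction entirely.
            print(f"classified as {label!r}: {predict_path(label)}")
            previous = label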

Credit: David Berkowitz, Flickr

What Went Wrong?

At 1.3 seconds before the collision, the car determined that the situation required an emergency braking maneuver. However, the system was unable to engage the brakes on its own. The car, a Volvo XC90, was capable of doing this, but Uber had disabled that system to, according to the report, "prevent erratic vehicle behavior." The car also had no way to alert the human occupying the driver's seat.
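The report's description amounts to a decision procedure with no usable output. A minimal, hypothetical sketch (illustrative only, not Uber's software) makes the gap obvious: the system can conclude that emergency braking is needed, but with braking disabled and no driver alert programmed, every branch ends in inaction.

    # Hypothetical sketch of the failure mode the NTSB report describes.
    EMERGENCY_BRAKING_ENABLED = False  # disabled to "prevent erratic vehicle behavior"
    DRIVER_ALERT_ENABLED = False       # no alert mechanism was programmed

    def respond_to_hazard(seconds_to_collision):
        if seconds_to_collision > 1.3:
            return "keep driving normally"
        if EMERGENCY_BRAKING_ENABLED:
            return "apply emergency brakes"
        if DRIVER_ALERT_ENABLED:
            return "warn the safety driver"
        # Both safety responses are unavailable, so nothing happens.
        return "do nothing"

    print(respond_to_hazard(1.3))  # -> "do nothing"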

Credit: Don DeBold, Flickr (2014)

About a second before impact, the human driver reacted but didn't start to brake until after the crash. Police said the driver's gaze was not on the road but on something "inside the car." Safety drivers are required to monitor a message screen located in the center of the car. When the driver looked up, it was too late. Possible intoxication of the pedestrian could also have slowed her reaction time or kept her from noticing the car.

What does this mean for autonomous cars?

Naturally, this accident prompted concerned statements from both government officials and Uber itself. Uber suspended the limited use of its autonomous cars in Arizona. However, the company plans to restart limited use of these vehicles in Pittsburgh, Pennsylvania, where it previously operated until the program ceased after the accident. The city government is open to Uber's test program, though it has rules it wants the company to follow.

Credit: Smoothgroover22, Flickr (2014)

Autonomous cars are still a very new technology, and while many companies work toward perfecting it, mistakes still happen. In California, 34 accidents involving autonomous cars have occurred since 2014. The overwhelming majority of these resulted from human error; only four happened through the fault of the autonomous car, according to Axios. Meanwhile, each year tens of thousands of people die at the hands of human-operated vehicles, and millions more are injured. It's a safe hypothesis that, despite the tragic accident in Arizona, autonomous cars are safer than human-operated cars.

Was the Autonomous Car at Fault?

Credit: Zombieite, Flickr (2017)

Autonomous cars are still a developing technology, but they've been in operation for years. Though no less tragic for it, the accident in Arizona is the first fatal one attributed to an autonomous car. The standard protocol for operating these vehicles places one person in the driver's seat and one in the passenger seat. The person riding shotgun is meant to monitor the system.

Uber reduced the personnel to only the driver in 2017, leaving that one person to monitor the systems and watch for warning signals. Ironically, the driver appeared to be checking those messages at the very moment the computer determined an emergency braking maneuver was needed. The system, however, wasn't programmed to alert the driver. This leads some to believe that the fault doesn't lie with the autonomous car.

Yes, the car was in control, but it recognized that it needed to stop to avoid a collision. However, it could neither execute an emergency braking maneuver nor alert the driver. Remove either of those restrictions, and it is possible that Elaine Herzberg would be alive today. The NTSB has not ruled on the matter, as this is just a preliminary report. Once the investigation is complete, we'll know more about what, or who, gets the blame.

Autonomous Cars - How Are They Safer?

When we think of modern technology, we often think of its failures, like phones that don't always work. So if we expect people to put their lives in the hands of a car driven by a computer, they are rightfully leery. But what drives these cars are not smartphones. They are complex, sophisticated systems that use radar and laser sensors, along with GPS, to navigate the streets.

These machines can do complex calculations in a fraction of the time it takes human drivers to react. The vast majority of the time, they correctly identify obstacles, dangers, and other problems, then avoid them. More simply, these machines aren't subject to human failings. They obey traffic laws. They don't drink and drive. They don't take their eyes off the road to reply to a text message or find an episode of a favorite podcast. They don't fall asleep behind the wheel from exhaustion.
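To put that reaction-time advantage in rough numbers, here is a back-of-the-envelope calculation. The figures are illustrative assumptions (about 1.5 seconds for a human to perceive a hazard and start braking, versus a small fraction of a second for a computer), not measurements from any particular vehicle:

    # Back-of-the-envelope sketch: distance covered before braking even begins.
    # Reaction times are illustrative assumptions, not measured values.
    MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

    def reaction_distance(speed_mph, reaction_time_s):
        """Meters traveled between spotting a hazard and touching the brakes."""
        return speed_mph * MPH_TO_MPS * reaction_time_s

    speed = 40  # mph, a typical urban arterial speed
    print(f"Human (1.5 s):    {reaction_distance(speed, 1.5):.1f} m")  # ~26.8 m
    print(f"Computer (0.1 s): {reaction_distance(speed, 0.1):.1f} m")  # ~1.8 m

At 40 mph, roughly 25 extra meters of travel separate the two before the brakes are even touched, and that difference alone can decide whether a collision happens at all.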

Still, they aren’t perfect. Systems can fail. Modern autonomous cars still don’t navigate in all bad conditions as well as humans do. However, if society doesn’t lose its nerve to develop this technology, it won’t be long before they will be. 

Autonomous cars will continue to advance.

Experience is the best teacher. If we ever expect to reach the point of having flying cars, we'll need to hand over some trust to these autonomous vehicles. Nonetheless, further research and a healthy dose of caution are required as well.

What do you think about self-driving cars? Would you ride in them? Let us know in the comments below and share the story with your friends.
