The Problem of Self-Driving Cars: Feeling Safe vs Being Safe


By now, you have probably heard of Google’s project to develop self-driving cars. You have probably also heard that one of their prototypes caused a car accident on Valentine’s Day. Even though it was only a minor fender-bender, this crash highlights the essential problem of self-driving cars, and it doesn’t have anything to do with whether or not they will save lives. For most of us, it’s a fundamental question of being safe versus feeling safe.

It is estimated that over 90 percent of traffic accidents are caused by human error. A self-driving car wouldn’t make human errors (the artificial intelligence discussion is for another day), and Google executives are using this fact as their primary argument for the legalization of self-driving cars. Those 90 percent of accidents, they say, could be prevented. Even so, according to a recent survey by AAA, three out of four Americans would feel afraid to ride in a self-driving car. But why?

How Important Is the Feeling of Safety?

All the figures and equations in the world cannot convince a person to feel safe. When you are behind the wheel of a car, you feel in control. You trust your instincts and your skill to help you adapt to situations you might encounter. This knowledge that you are able to act and influence what will happen in an emergency situation makes you feel safe.

A car that moves on its own may be able to judge stopping distances better than a human, but the person inside will no longer have the control over the vehicle that currently makes drivers and their passengers feel safe (especially since Google’s latest model has no steering wheel or pedals at all). The Valentine’s Day crash has only confirmed these fears by showing that the probability calculations that run these cars are not infallible. Would a human driver have been able to prevent the crash? Maybe. There is really no way to know, but the question itself throws the practical viability of self-driving cars into doubt.

It all becomes much more complicated when you consider that future self-driving cars might cause serious injuries or even deaths in accidents. If that happens, who is to blame? A car can’t pay victims for their damages, and the “driver” certainly can’t be held accountable for sitting inside. It also seems unlikely that Google will agree to reimburse people for medical bills and property damage every time one of its cars causes a wreck.

Other serious concerns that have been raised about self-driving vehicles include worries that people may be able to hack into a car’s technology and make dangerous changes. Even more distressing is the question of what personal data these cars will collect from passengers and how it will be used. All of these as-yet-unanswered questions contribute to the feeling of unease surrounding these vehicles.

It might be irrational, but the feeling of safety may be the biggest obstacle facing Google and its self-driving cars. If people don’t feel safe inside them, or sharing the road with them, there is no way that voters will pass legislation to make them legal. Still, Google has petitioned the U.S. Department of Transportation to consider instituting federal safety tests for self-driving cars in an effort to legalize them for commercial sale. If it succeeds, the future of cars, car accidents, and car accident personal injury lawsuits may get very complicated very quickly.

Our personal injury lawyers have years of experience helping families affected by car accidents in Illinois and Missouri. Call for a free consultation.



