It appears that California is going to be the state that helps kick off the reality of driverless cars eventually being on the road. With the Golden State recently approving DMV testing of driverless cars this spring, you have to wonder just what the rules are going to be. With only a few details released so far, how will we really react when we end up hopping aboard a driverless car for more than just short excursions to the grocery store?
According to reports, companies such as Google will have to submit a license plate number to California's DMV this fall in order to qualify for the testing. They'll also need vehicle identification numbers and the usual insurance coverage ($5 million worth), since none of those requirements will be waived just because a car is driverless. While insurance might get a little more complicated when it comes to who gets the blame if something goes wrong, one new testing rule offers some clues.
The new DMV rules will reportedly require someone to sit behind the wheel, ready to take control of the vehicle if something actually does go wrong. In that regard, driverless cars may technically still have a driver. Even so, what does it say about our faith in driverless cars that a person has to be behind the wheel just to make sure nothing happens? And can that person stop something from happening in time without taking some of the blame?
The Legal Problems on Who the “Driver” Is
Another aspect of the DMV rules is that the person behind the wheel has to pass a driving test of their own. They'll also have to know the technology inside and out so they'll know what to do if something goes awry. If that sounds like we're living in a HAL 9000 world, you're not alone. You're also not alone if you're already conjuring a nightmare scenario in which a driverless car mows down pedestrians in a busy metropolis and only stops once the person behind the wheel pulls the plug.
In such a scenario, will the person behind the wheel ultimately get the blame for failing to shut the car down in time? Nobody yet knows whether stopping the car would be as simple as pressing the brake, or whether the artificial intelligence could prevent the person behind the wheel from taking over. Once the car is programmed with an objective, it may pursue it so single-mindedly that you would almost have to cut the power supply to make it stop.
This is just part of the reason the public might be apprehensive about riding in a driverless car in the coming years. With other states already signing on to test them, and public use perhaps coming as early as next year, the legal question of who would be to blame is going to have to be spelled out. That's because something is inevitably going to go wrong eventually. No matter how well a driverless car performs (and Google's track record so far has been strong), you have to assume scenarios could come up that the car isn't programmed to understand.
What Might Go Wrong?
The scariest scenario for driverless cars is driving in a major city, where pedestrians, bicyclists and a million other hazards are moving in all directions. While we're told that driverless cars will detect all those dangers and respond to them, you can see complacency developing after a year or two of success. The first incident will drop driverless cars into a legal pit, with arguments over whether they're really safe or not. It could end up as bad as General Motors being grilled in Washington for knowing its cars had a defect and failing to issue recalls.
Deciding who's to blame will also revive a debate we once saw only in sci-fi novels: whether the robot or the human is responsible for a problem. Do you blame the creator of the driverless car, or does the driver have to share the scrutiny?
Unless it's spelled out in legal terms everyone understands, this question is going to make headlines as driverless cars become standard over the next few years. We may eventually have to give driverless cars plenty of space, since the chaos of a big city may prove too much even for artificial intelligence.