
A World Without Drivers

May 28, 2014
By EvanWildenberg SILVER, Cannon Falls, Minnesota

Imagine a busy highway, filled with cars. They switch lanes, take exits, and otherwise behave like normal cars. But normal cars they are not. Some of them have no one in the driver's seat, and the vehicles have arrays of sensors mounted on top. These are robotic cars. This is a vision that many people and companies hope for. I am not sure that mankind is ready for it. Driverless cars still have many issues to deal with. I hope to highlight some problems with the way self-driving cars work physically, as well as present the legal problems and ethical dilemmas they raise.
According to The Atlantic magazine, Google's driverless cars work by using a very detailed map of the roads. This is a complex digital map, with everything down to the height of the curb recorded. This poses a problem for widespread use, because an area must be mapped before the car can drive there. Another issue, highlighted in a National Post article on May 21, 2013, is a problem with the sensors. In California, where Google's cars are currently in use, the sensors work perfectly. But on roads where snow and sand are present, the camera and the radar can become covered and rendered useless. The possibility of hacking is another vulnerability, described in GPS World magazine. Information from the sensors could be personal and would be disastrous in the wrong hands. Of course, the issues with self-driving cars are not confined to physical problems.
Robot cars still have several issues to be resolved before they can be handled properly by America's legal system. Currently, as stated in The Atlantic, if the car were to make a mistake, the person behind the wheel would be responsible. For example, if a man were reading a book in the driver's seat of his driverless car, and the car got pulled over for speeding, the man would be fined for the car's actions, even though he may not be the one truly responsible. Although, according to California law, the man would not be allowed to read a book behind the wheel anyway. A National Post article shows that an attentive driver must always be behind the wheel, ready to take control. Actually driving the vehicle might be less tedious. Another problem, shown in The New York Times, is that machines can't be charged with crimes. If a robot car were to hit and kill a person, who would be held responsible? On the other hand, the same article also shows how the information and footage recorded by the cars can help assign blame in lawsuits. Besides legal issues, the cars must also make some difficult decisions.
When people are driving, they don't need to think about the rare scenarios that could happen. But, according to an Atlantic article, programmers of driverless vehicles need to provide guidelines for every possible situation. These could be general guidelines, but the programmers still need to think about tough ethical problems. When people are driving, they often violate a few small laws. An example the Atlantic article gives is a small branch sticking out over a highway. If there is no oncoming traffic, most people would simply go around it in the other lane. A robot car, on the other hand, would observe the law prohibiting it from crossing a double-yellow line. It might stop, avoiding the branch but possibly causing other cars to crash into it. There are also no-win scenarios to consider. Philippa Foot's trolley problem is a good example of this type of dilemma. Suppose there were a trolley that could not stop. The driver has the choice to continue on the same track, which has five workmen on it, or switch to a track with only one person on it. Statistically, it is better to kill one person than five, but ethically, killing anybody is wrong. This scenario applies to self-driving cars in situations where a crash cannot be avoided. If it is a choice between a motorcyclist with a helmet and one without, which should the car hit? If it hits the one with the helmet, he is more likely to survive, but he is essentially being punished for being safe. If the car simply chooses a random direction, the company may be blamed for not taking advantage of the information it gathered. There are still many difficult problems to figure out.
The issues to be solved are not only mechanical but also legal and even ethical. While this vision may eventually become reality, it won't be any time soon. Until then, people will have to handle the driving themselves.


