As a police officer used to handing out tickets to speeding cars (those classic deathtraps), it must be weird to pull over a vehicle, walk up to the driver's side window, and see no one.
But thanks to Google, that’s exactly what happened to a police officer in Los Altos, CA.
It was awkward.

The officer was actually pulling over one of Google's driverless vehicles, so technically there was no "driver" to ticket. The car had been stopped for doing 25 mph in a 35 mph zone.
It may seem annoying, but the cars are capped at a maximum speed of 25 mph for safety reasons. Google is doing its best to keep the cars as conservative in their decisions as possible; that both avoids accidents and avoids painting a picture of driverless vehicles speeding all over the place.
While the officer did speak with the actual human sitting in the passenger seat, no tickets were issued.
Google responded to the event via Google+ saying, “Like this officer, people sometimes flag us down when they want to know more about our project. After 1.2 million miles of autonomous driving (that’s the human equivalent of 90 years of driving experience), we’re proud to say we’ve never been ticketed!”
Obviously, this particular officer knew there wouldn't be anyone sitting in the driver's seat, since the cars are pretty recognizable, but what happens when society progresses to the point where a majority of cars are driverless?
Even if you don't think a driverless future is a reality, just dream with me for a second here. Who is responsible in a car where no actual living person is making the decisions?
Technically, the cars might eventually get to the point where no human interaction is necessary.
In a worst-case scenario, if a computer or sensor glitches and causes an accident, who gets the ticket? Human "drivers" might have the option to override, but they also might not be paying enough attention to notice. What about when a pedestrian walks out and causes the car to swerve (or the car hits the pedestrian)? Who is at fault: the human in the car, or the car itself? For something really bad, does responsibility fall to the manufacturer, as it would in today's recalls? And how do you insure a driverless car?
This is an entirely new wave of technology, so it can't be regulated by current laws and probably shouldn't even be thought of within the parameters of current regulations. It requires an entirely new way of thinking.
These are the questions that lawmakers should be sorting out now. If they don't, we risk ending up in the same position we're currently in with drones: the technology has completely outpaced the law, and we're scrambling to catch up with it.
Right now, Google and other companies are just testing the technology. They are playing it extra safe (pausing 1.5 seconds after a light turns green to avoid collisions), and they are granted some reprieve because the technology is novel.
Soon there will be more and more autonomous vehicles on the road, which means we're heading for an extremely awkward period when regular cars and autonomous cars share the road. The autonomous vehicles are designed to be extra safe, but human drivers are often hurried, not paying attention, and, in my personal experience, generally terrible at driving.
I certainly don’t have the answers, but make your opinion heard in the poll below.