Engineers often innovate their way into an electronic frontier where the law hasn’t been written yet.
If major carmakers and analysts put their money where their mouth is, most semi-autonomous cars will be available to regular consumers by around 2025. GM, Mercedes-Benz, Audi, Nissan, BMW, and Renault have officially predicted 2020. Google says they’ll release their technology in 2018.
It’s a race. Of course it is. Natural economic and technological forces are at work. Carmakers want to be the first to get autonomous cars—or at least various automated controls—to market. High-end vehicles already offer technologies like parking assistance and blind-spot alerts, all of which will become standard eventually.
The issue is that innovation moves fast, but lawmaking doesn’t. And human nature tends to stay the same. The process of adopting regulations is generally slow, hardly ever proactive, and has to clear a path through public discourse. There are social and legal barriers to the widespread adoption of autonomous cars.
So, what will it take to get autonomous vehicles — driverless cars, self-driving cars, robo-cars — in widespread use on public roads, legally?
The Speed of Tech vs. Legislation
Nissan’s Eporo Robot Car can move autonomously in groups while avoiding collisions with obstacles.
The companies behind the tech are tapping into nature, sociology, murky legality, and generally mind-bending creativity to develop their self-driving AI algorithms.
- Google realized they had to program a little “aggression” into their self-driving cars at four-way stops. (The car finds the right opportunity to scoot forward to show its intention.) Otherwise, it was too easy for other drivers to take advantage of autonomous vehicles that politely yield to others according to road rules.
- Google’s self-driving cars have yet to receive a ticket.
- Nissan is studying robotic fish and bumblebees to figure out how networked robo-cars can communicate and stay spatially aware of each other, in order to travel in close packs and avoid collisions (a rough sketch of that idea follows this list).
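Nissan hasn’t published how the Eporo pack actually works, but the fish-school idea maps naturally onto the classic separation-and-alignment rules used in flocking simulations. The sketch below is purely illustrative (the `Car` class, gains, and gap values are invented for the example) and shows how each car in a pack could nudge its own acceleration using only what it senses about nearby neighbors:

```python
import math
from dataclasses import dataclass

@dataclass
class Car:
    x: float   # position along the road (m)
    y: float   # lateral position (m)
    vx: float  # velocity components (m/s)
    vy: float

def flocking_adjustment(car, neighbors, min_gap=4.0, k_sep=1.0, k_align=0.5):
    """Return an (ax, ay) acceleration nudge that keeps the car spaced out
    from its neighbors while matching the pack's average velocity,
    in the spirit of fish-school separation and alignment rules."""
    ax = ay = 0.0
    if not neighbors:
        return ax, ay

    # Separation: push away from any neighbor closer than the minimum gap.
    for other in neighbors:
        dx, dy = car.x - other.x, car.y - other.y
        dist = math.hypot(dx, dy)
        if 0 < dist < min_gap:
            ax += k_sep * (dx / dist) * (min_gap - dist)
            ay += k_sep * (dy / dist) * (min_gap - dist)

    # Alignment: steer toward the average velocity of the pack.
    avg_vx = sum(o.vx for o in neighbors) / len(neighbors)
    avg_vy = sum(o.vy for o in neighbors) / len(neighbors)
    ax += k_align * (avg_vx - car.vx)
    ay += k_align * (avg_vy - car.vy)
    return ax, ay
```

The appeal of this style of rule is that no central traffic controller is needed: each vehicle reacts only to the neighbors it can sense, which is how schools of fish manage to move in tight formation without colliding.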
Cars available today can park themselves, decode road signs, time traffic lights, follow lanes, see objects in 3-D, and apply the brakes before a collision.
On the other hand, self-driving cars have been street legal in California since 2012, and so far the state has only made rules dictating that it should have rules ready by 2015.
Four states—Nevada, California, Florida, Michigan—and Washington, D.C. have enacted laws that define “autonomous vehicle” or “autonomous technology”, or allow for testing on open roads, albeit with restrictions. Several more are following suit.
The National Highway Traffic Safety Administration (NHTSA) has gone so far as to define the different levels of vehicle automation:
- Level 0—No Automation—The driver is in control at all times.
- Level 1—Function-Specific Automation—One or more control functions assist the driver, such as electronic stability control or assistive braking.
- Level 2—Combined Function Automation—At least two control functions work in unison to assist the driver, such as adaptive cruise control with lane centering.
- Level 3—Limited Self-Driving Automation—The car takes full control under certain traffic or environmental conditions, but the driver is expected to be available for occasional control when the car determines a handoff is needed.
- Level 4—Full Self-Driving Automation—The vehicle performs and monitors all functions. The driver provides destination or navigation input, but is not required to be present to control the vehicle.
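Because the NHTSA levels form a simple ordered scale, they translate directly into software. Here is a minimal, hypothetical sketch (the enum and the helper function are my own naming, not anything published by NHTSA or a carmaker) of how vehicle code might branch on the level it is operating at:

```python
from enum import IntEnum

class NHTSALevel(IntEnum):
    """NHTSA's automation levels captured as an ordered enum."""
    NO_AUTOMATION = 0         # driver controls everything
    FUNCTION_SPECIFIC = 1     # e.g., stability control or assistive brakes
    COMBINED_FUNCTION = 2     # e.g., adaptive cruise control plus lane centering
    LIMITED_SELF_DRIVING = 3  # car drives itself, driver must stand by to take over
    FULL_SELF_DRIVING = 4     # car performs and monitors all functions

def driver_must_monitor(level: NHTSALevel) -> bool:
    # Only Level 4 relieves the human of the duty to watch the road.
    return level < NHTSALevel.FULL_SELF_DRIVING
```

Using an ordered scale also makes the legal question easy to ask in code: anything below Level 4 still implies an attentive driver.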
No law on the books addresses everything that could happen, and none will, until it happens.
We’re Pretty Sure Now that Self-Driving Cars Are Safer
Lexus’ self-driving car at CES 2013.
Estimates range from cutting 1.2 million car accidents in half to reducing deaths, injuries, and property damage by as much as 90 percent.
The vast majority of car accidents are due to inattention or impairment. With hundreds of thousands of miles logged by its self-driving Prius and Lexus cars (powered by the Google Chauffeur software), Google is collecting data that supports their superior safety over human drivers. Computers don’t get distracted. They accelerate and brake more smoothly, and they are better at maintaining a safe distance from the car ahead. Automated systems are simply more efficient and more consistent about safety.
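Google hasn’t said how Chauffeur manages following distance, but the “safe distance” behavior can be illustrated with a constant-time-headway rule, the idea behind most adaptive cruise control systems. Everything below (the function names, the gains, the two-second headway) is an assumption chosen for the example, not a description of any production system:

```python
def target_gap(speed_mps: float, headway_s: float = 2.0, standstill_m: float = 2.0) -> float:
    """Desired distance to the car ahead: a constant time headway
    (the familiar two-second rule) plus a small standstill buffer."""
    return standstill_m + headway_s * speed_mps

def gap_acceleration(gap_m: float, speed_mps: float, lead_speed_mps: float,
                     k_gap: float = 0.2, k_speed: float = 0.6) -> float:
    """Simplified following controller: accelerate or brake in proportion to
    the gap error and the closing speed. Real systems layer limits, filtering,
    and sensor fusion on top of this basic idea."""
    gap_error = gap_m - target_gap(speed_mps)
    closing_speed = lead_speed_mps - speed_mps
    return k_gap * gap_error + k_speed * closing_speed
```

At 25 m/s (roughly 55 mph) the target gap works out to 52 meters, and the controller starts easing off the moment the measured gap falls short. A distracted human drifts; the controller applies the same correction every single time, which is the consistency the safety argument rests on.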
Where Does Liability Start and End?
Technology changes human behavior. As autonomous vehicles become more widespread, behavior may change again, but the same may not hold for human nature. What happens if software errs and someone gets hurt?
- If Jack doesn’t update the software that drives his car, and the car gets into an accident, whose fault is it: Jack’s or the software vendor’s?
- What’s to keep parents from letting their kids play in the street, knowing avoidance technology will keep cars from hitting them?
- How do we keep driverless cars from becoming a powerful tool for terrorists? Could a self-driving car be packed full of explosives and deliver itself to a pre-programmed address?
- If networking cars—and a central database of all cars—provides proximity information that keeps everyone safe, does that take precedence over privacy?
- If insurance premiums start to plummet because cars are safer, will insurance companies have to find other insurance products to sell? Perhaps against auto hackers.
At this point, we are still conducting thought experiments to help determine how laws will be made, especially as we explore the long-term implications of robotic transportation. We do know, however, that the lives saved will greatly outweigh the risks, considering the kind of damage human drivers tend to cause.
In the end, lawmakers won’t be dictating what goes on the road. The way will be paved by the engineers who are developing the technology.
How Will This Affect Engineering?
Source: Texas Instruments “TI Vision SDK, Optimized Vision Libraries for ADAS Systems”
The goals of car design would change:
- A third of car design is devoted to safety systems; imagine if engineers no longer had to consider bracing cars against impact
- The value of a car would be based on software, such as the ability to find the path with the least traffic/obstacles
- Cars wouldn’t be about acceleration but about their ability to provide smooth, silent rides, so people could sleep, work, or hold meetings on the way to work
If anything goes wrong with self-driving cars, existing product liability laws put the blame on the manufacturer. However, it won’t be long until innovation and the widespread availability of autonomous vehicles start pushing past the boundaries of existing law.