While self-driving vehicles are still in their infancy, just having cars with semi-autonomous capabilities on the road is a significant accomplishment for the industry. Hopes and aspirations surrounding this industry are certainly high right now, yet the present models of fully and semi-autonomous vehicles possess design and software flaws that researchers and engineers must resolve. These hindrances primarily pertain to a vehicle’s autonomous navigation and safety features, both of which need to be perfected if the industry hopes to go mainstream.
1. Crossing Bridges
As we know, vehicles with autonomous capabilities are programmed with intricate GPS maps, which they compare against the physical surroundings their cameras and sensors perceive. Essentially, this is an autonomous vehicle’s means of confirming what is and isn’t supposed to be present, which helps it avoid objects and pedestrians. One drawback of bridges is their lack of land-based environmental cues like buildings. This starves the vehicle’s sensors of reference points, making it difficult for autonomous cars to place themselves on the road.
As mentioned earlier, driverless vehicles use GPS to position themselves, but this method is far less precise without man-made structures in the vicinity. A relatable analogy for this design flaw: imagine walking a straight line across a massive room when the lights go out halfway through. Although you can’t see anything, you have a general idea of which direction to continue, but you’re very susceptible to drifting off course.
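To make the drift problem concrete, here’s a toy Python sketch (purely illustrative — the numbers and the simple bias model are hypothetical, not any automaker’s actual localization code). The car estimates its position by dead reckoning; periodic landmark “fixes” (buildings, signs) reset the estimate, while a featureless bridge offers no fixes, so error accumulates:

```python
def simulate_drift(steps, fix_interval):
    """Toy dead-reckoning sketch (illustrative only, not real AV code).

    The car moves 1 m per step; its odometry systematically overshoots
    by 2 mm per metre. When a landmark fix is available, the estimate
    snaps back to the true position. fix_interval=None models a
    featureless bridge where no fixes ever arrive."""
    true_pos = est_pos = 0.0
    for step in range(1, steps + 1):
        true_pos += 1.0
        est_pos += 1.002  # 0.2% odometry bias per metre
        if fix_interval and step % fix_interval == 0:
            est_pos = true_pos  # landmark correction resets the estimate
    return abs(est_pos - true_pos)

city_drift = simulate_drift(500, fix_interval=10)     # landmarks every 10 m
bridge_drift = simulate_drift(500, fix_interval=None)  # no landmarks at all
```

Over the same 500 m, the bridge run accumulates roughly a full metre of error while the city run stays near zero — the same “lights out halfway across the room” effect described above.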
2. Lane Identification
The US government has been scrutinized for the country’s lackluster infrastructure, and one of the most glaring weaknesses of its roads and bridges is poorly painted (and poorly maintained) street lines. This is problematic for autonomous cars’ sensors and cameras, which rely on bold, properly oriented street lines to drive and switch lanes safely. The flaw has even prompted proposals for core infrastructure changes if these vehicles are to traverse our roads successfully.
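The core issue can be reduced to a contrast problem. Here’s a deliberately simplified Python sketch (real systems use full computer-vision pipelines; the threshold and intensity values here are made up for illustration):

```python
def lane_line_visible(paint_intensity, asphalt_intensity, min_contrast=50):
    """Toy visibility check (illustrative only).

    A camera pipeline can only segment a lane line if the painted stripe
    contrasts strongly enough with the surrounding asphalt. Values are
    8-bit grayscale intensities (0-255); min_contrast is a hypothetical
    detection threshold."""
    return (paint_intensity - asphalt_intensity) >= min_contrast

fresh_paint = lane_line_visible(paint_intensity=230, asphalt_intensity=90)
faded_paint = lane_line_visible(paint_intensity=120, asphalt_intensity=90)
```

Fresh, bold paint clears the threshold; paint faded toward the asphalt’s own gray does not — and the car effectively loses the lane.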
3. Directed Traffic
Fortunately, autonomous vehicles are programmed to react properly at working stoplights. That said, the stop-and-go process becomes far more difficult when a person is directing traffic instead. Roadside construction, accidents, and other hazards are notable situations where traffic lights are overridden by conditions on the ground. With the stoplights suspended, drivers redirect their attention to the workers or police officers directing traffic.
While this is a seamless transition for human drivers, that level of comprehension isn’t programmed into a self-driving vehicle’s software. Add the fact that these scenes and work zones are never laid out consistently, and a driverless vehicle becomes prone to countless situations its programming can’t handle as smoothly as a human driver would. For now, the most viable solution is improving GPS and mapping systems to detect these disturbances so autonomous vehicles can avoid the affected areas.
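Routing around a flagged work zone is the easy half of that solution, and it’s standard shortest-path territory. Here’s a minimal Python sketch using Dijkstra’s algorithm (the road network, node names, and the idea of a “closed” set fed by live map data are all hypothetical):

```python
import heapq

def shortest_route(graph, start, goal, closed=frozenset()):
    """Dijkstra's algorithm over a toy road network (illustrative only).

    graph maps each node to {neighbor: distance}. Nodes in 'closed'
    (e.g. intersections flagged as work zones by live map data) are
    skipped, so the returned route detours around them."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:  # reconstruct the path back to start
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path))
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in graph.get(node, {}).items():
            if nbr in closed:
                continue  # avoid flagged work zones entirely
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return None  # goal unreachable

roads = {
    "A": {"B": 1.0, "C": 4.0},
    "B": {"D": 1.0},
    "C": {"D": 1.0},
    "D": {},
}
normal = shortest_route(roads, "A", "D")               # takes the short leg via B
detour = shortest_route(roads, "A", "D", closed={"B"})  # B flagged: reroute via C
```

The hard half, of course, is detecting the work zone in the first place — which is exactly the sensing problem the paragraph above describes.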
4. Volatile Weather
In bouts of heavy rain and snowfall, the lidar and camera systems on autonomous vehicles suffer heavy interference, from outright blindness to distorted readings. In heavy snowfall especially, self-driving cars’ sensors and cameras can’t make out street markers and lane dividers (and in some cases struggle to identify land-based surroundings). Automakers are trying to develop systems capable of collecting the data needed to operate autonomous vehicles safely in bad weather.
Ford, for example, developed a high-resolution 3D map set for self-driving vehicles that contains information about the road, including nearby signs, landmarks, and anything on top of the street surface (similar to a topographic survey). This lets an autonomous vehicle precisely pinpoint itself even when it can’t identify lane markings. Weather-related failures aren’t limited to precipitation, either. Last year, a driver was killed in a Tesla Model S when the vehicle’s Autopilot function failed to distinguish a large white 18-wheeler from its background. It was theorized that the cameras and sensors were distorted by glare from the sun.
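The map-matching idea behind that approach can be sketched in a few lines of Python. This is a loose, one-dimensional caricature — the sign positions, measured ranges, and candidate grid are all invented, and a production system would do this in 3D with far richer data:

```python
def localize_on_map(landmark_map, observed_ranges, candidates):
    """Toy 1-D map-matching sketch (illustrative only).

    landmark_map holds the surveyed positions (metres along the road) of
    roadside landmarks such as signs. The car measures the forward range
    to each landmark and picks the candidate position whose predicted
    ranges best match those measurements -- no lane markings required."""
    def mismatch(pos):
        return sum(abs((lm - pos) - rng)
                   for lm, rng in zip(landmark_map, observed_ranges))
    return min(candidates, key=mismatch)

signs = [120.0, 180.0, 260.0]     # surveyed sign positions from the prior map
ranges = [20.4, 79.7, 160.2]      # noisy forward ranges measured by the car
estimate = localize_on_map(signs, ranges,
                           candidates=[float(x) for x in range(0, 201, 5)])
```

Even with noisy range measurements, the car at true position 100 m snaps to the right spot on the grid, because the landmark geometry is distinctive — which is precisely why a snow-covered lane line stops mattering once the map is rich enough.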
5. Unpredictable Drivers, Pedestrians
Autonomous vehicles are programmed to follow all driving laws. Human drivers (as I’m sure we all know, and have done ourselves) break or bend these rules regularly, which can lead to concerning interactions between autonomous and human-controlled cars. Self-driving vehicles, for instance, will always travel at or below the speed limit, while many people drive well above it.
This could lead to tailgating and road rage between an autonomous vehicle and an irate human driver, among other issues. Pedestrians and cyclists can be just as unpredictable, since it’s impossible to know how or when someone will act in a given situation. Keep in mind that self-driving vehicles are only prepared for what’s supposed to happen. People jaywalking, cycling or walking on the wrong side of the road, or making unexpected crossings and turns can all confound an autonomous vehicle’s programming.