We fall asleep; we drive drunk; we get distracted; sometimes we are simply bad at driving, and the consequences are both costly and deadly. More than a million people die on the world's roads every year, and the shift to autonomous commercial trucking alone could cut some companies' transportation costs in half.
Yet the public is not convinced, and it grows more skeptical with each report of an accident involving a self-driving car.
Edge Cases: The Achilles Heel of Self-Driving Cars?
Fair or not, the burden of demonstrating autonomous vehicle safety falls on those advocating for the technology. To meet it, companies must identify and address the edge cases that cause high-profile accidents and erode public confidence in an otherwise safe technology.
What happens when a vehicle driving down the road spots a weather-beaten, bent, misshapen, faded stop sign? The situation is obviously rare; a transportation department would likely have removed such a sign long before it reached that state. But edge cases are exactly this kind of situation.
An edge case is a low-probability event that should not happen but does happen in the real world---exactly the kinds of cases that programmers and machine learning processes might not consider.
In a real-world scenario, the autonomous vehicle might detect the sign but fail to recognize it as a stop sign. Not treating it as one, it could proceed through the intersection at speed and cause an accident.
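To make the failure mode concrete, here is a minimal sketch of how a planner might fail safe when perception is uncertain. Everything here is illustrative: `classify_sign`, the returned confidence, and the 0.9 threshold are assumptions for the example, not any real vehicle's API.

```python
def classify_sign(image):
    """Stand-in for a trained sign classifier: returns (label, confidence).

    A weather-beaten, faded stop sign might come back with low confidence
    (values here are made up for illustration).
    """
    return ("stop_sign", 0.35)


def plan_at_intersection(image, threshold=0.9):
    """Choose an action based on classifier output.

    The key edge-case handling: when confidence is below the threshold,
    slow down and reassess instead of proceeding at speed.
    """
    label, confidence = classify_sign(image)
    if label == "stop_sign" and confidence >= threshold:
        return "stop"
    if confidence < threshold:
        # Uncertain perception: fail safe rather than barreling
        # through the intersection.
        return "slow_and_reassess"
    return "proceed"


print(plan_at_intersection(None))  # the degraded sign triggers the cautious path
```

The design choice matters more than the code: a system that treats "I don't know what this is" as a reason for caution degrades gracefully on edge cases, while one that silently defaults to "proceed" turns a perception failure into an accident.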
A human driver might struggle to identify such a sign too, but that is far less likely for an experienced driver. We know what a stop sign is, and unless it is in complete ruin, we know to stop at the intersection rather than drive through it.
This kind of situation is exactly what researchers at MIT and Microsoft have come together to identify and solve, which could improve autonomous vehicle safety and, hopefully, reduce the kinds of accidents that might slow or prevent the adoption of autonomous vehicles on our roads.