If we are to believe the hype generated by tech gurus at Tesla, Google, Delphi, and Mobileye (with its Level 4 system), as well as by car manufacturers, fully automated cars will be hitting the roads in large numbers within the next few years. Tesla and others already offer a limited form of autopilot. But even if that timeline holds, there remains the pressing need to refine the still-maturing artificial intelligence (AI) technology so that these cars can operate safely.

Further to go than people think
As the dream of self-driving cars seemingly comes closer to reality, some AI experts are going on record saying that AI will not be able to reliably avoid accidents for years or even decades to come. This painful recalibration of expectations could have disastrous implications for companies that have gone all in on driverless technology (including GM, with a model out in 2019).
Programmers are still finding holes in the "deep learning" systems that operate the cars. Current technology is good enough to navigate a car as long as the system identifies familiar objects and follows the rules. Driving safely, however, involves more complicated scenarios in which the unexpected arises. Last March, for example, a self-driving Uber struck a woman pushing a bicycle across the street outside a crosswalk. The technology identified her first as an unknown object, then as a vehicle, and finally as a bicycle, adjusting its course each time. In another incident, in California, a Tesla Model X sped up on the highway and veered into a barrier; there are still no answers as to why this happened.
Tech advocates are walking back expectations about this technology in different ways. One that is particularly troubling is inverting the road-safety paradigm: instead of making the cars safe for the roads, we make the roads safe for the cars. In other words, pedestrians should not assume that cars will see them and stop when they try to walk across the street. This turns 100 years of car culture upside down.
The future starts today
Even as the bugs in these systems are worked out, the reality is that these vehicles will likely be unable to react to unusual situations as well as a human driver would. That will likely mean many more serious injuries and deaths in the coming years, and lawsuits from the victims or their families, who had no interest in being part of the technology's learning curve.