Monday, September 07, 2015

Are Self-Driving Cars Fatally Flawed?

I read the following in the Guardian story Hackers can trick self-driving cars into taking evasive action.

Hackers can easily trick self-driving cars into thinking that another car, a wall or a person is in front of them, potentially paralysing it or forcing it to take evasive action.

Automated cars use laser ranging systems, known as lidar, to image the world around them and allow their computer systems to identify and track objects. But a tool similar to a laser pointer and costing less than $60 can be used to confuse lidar...

The following appeared in the IEEE Spectrum story Researcher Hacks Self-driving Car Sensors.

Using such a system, attackers could trick a self-driving car into thinking something is directly ahead of it, thus forcing it to slow down. Or they could overwhelm it with so many spurious signals that the car would not move at all for fear of hitting phantom obstacles...

Petit acknowledges that his attacks are currently limited to one specific unit but says, “The point of my work is not to say that IBEO has a poor product. I don’t think any of the lidar manufacturers have thought about this or tried this.” 
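The spoofing idea in the quotes above can be sketched numerically: lidar computes distance from a pulse's round-trip time of flight, so an attacker who fires a matching pulse that arrives earlier than the genuine echo makes the unit report a closer, phantom object. Here is a minimal toy illustration with hypothetical numbers; real units apply pulse coding and filtering that this sketch omits, and nothing here reflects IBEO's actual processing.

```python
# Toy model of lidar time-of-flight ranging and a spoofed "early echo".
# Hypothetical numbers for illustration only.

C = 299_792_458.0  # speed of light, m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance = (round-trip time) * c / 2, since the pulse travels out and back."""
    return round_trip_seconds * C / 2.0

# Genuine return: a car 60 m ahead reflects the pulse.
true_rt = 2 * 60.0 / C
print(round(range_from_echo(true_rt), 1))    # reports 60.0 m

# Spoofed return: the attacker's pulse arrives as if reflected from 10 m away,
# so the unit computes a phantom obstacle directly ahead.
spoofed_rt = 2 * 10.0 / C
print(round(range_from_echo(spoofed_rt), 1))  # reports 10.0 m -> brake for nothing
```

The attack works because a naive receiver cannot distinguish its own reflected pulse from one an attacker replays with the right timing; flooding the unit with many such pulses yields the "so many spurious signals" denial-of-service described above.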

I had the following reactions to these stories.

First, it's entirely possible that self-driving car manufacturers know about this attack model. They might have decided that it's worth producing cars despite the technical vulnerability. For example, WiFi has no defense against jamming of the RF spectrum, and there are also non-RF methods to disrupt WiFi, as detailed here. Nevertheless, WiFi is everywhere, although lives usually don't depend on it.

Second, researcher Jonathan Petit appears to have tested an IBEO Lux lidar unit and not a real self-driving car. We don't know, from the Guardian or IEEE Spectrum articles at least, how a Google self-driving car would handle this attack. Perhaps the vendors have already compensated for it.

Third, these articles may undermine one of the presumed benefits of self-driving cars: that they are supposed to be safer than human drivers. If self-driving car technology is vulnerable to an attack not found in driver-controlled cars, that is a problem.

Fourth, does this attack mean that driver-controlled cars with similar technology are also vulnerable, or will be? Are there corresponding attacks for systems that detect obstacles on the road and trigger the brakes before the driver can physically respond?

Last, these articles demonstrate the differences between safety and security. Safety, in general, is a discipline designed to improve the well-being of people facing natural, environmental, mindless threats. Security, in contrast, is designed to counter intelligent, adaptive adversaries. I am predisposed to believe that self-driving car manufacturers have focused on the safety aspects of their products far more than the security aspects. It's time to address that imbalance.


David Wilburn said...

Similar things could be said about vulnerabilities in human drivers. Aiming a visible spectrum laser pointer at a driver's eyes is likely to cause an accident (or perhaps more accurately, an "on purpose").

Crazy Computer Dad said...

I was thinking exactly the same thing as David here. Many researchers have documented the flaws in the human brain's visual processing. Add in auditory and pressure stimuli, and you can wreak all kinds of havoc with daily commuters.

James said...

@David Wilburn... it's just not the same if "you" crash or someone makes you crash. Honestly, cars should not have systems that connect to a network unless we are talking about radio or something non-critical.

Bill said...

@James. I totally agree that cars should not have systems that connect to a network unless it is being used for something non-critical. I am starting to think that the automobile industry is getting a little carried away with the technology being used by vehicles. I find it funny that as technology becomes more advanced, it also becomes more vulnerable and prone to disaster.

I can only imagine the highway accidents that will happen when logistics companies start transporting goods via self-driving trucks.