Driverless Cars Could Be Hacked With Stickers on Traffic Signs, Study Suggests

Justin Sullivan/Getty Images

As driverless cars inch toward becoming regular sights on our streets, experts have begun warning that these connected vehicles could be vulnerable to hackers who take control from a distance. Most of those warnings concern breaking into the internet-connected computer on board, but there's an analog way to disrupt a driverless car, too, as Autoblog reports. Researchers from across the U.S. recently figured out how to trick a driverless car with a set of stickers, as they detail in a paper posted on arXiv.org.

They examined how altering the appearance of a stop sign could mislead a driverless car, tricking its cameras and sensors into reading the stop sign as, for instance, a speed limit sign for a 45-mile-per-hour zone.

They found that by covering the sign with a printed mask that looks almost identical to the sign itself (so a human wouldn't necessarily notice the difference), they could fool a road-sign classifier like those used by driverless cars into misreading the sign 100 percent of the time.
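The idea behind this kind of attack can be sketched in a toy example. The snippet below is purely illustrative (it is not the paper's attack or a real sign classifier): a tiny linear classifier over a flattened 4×4 "sign image" is flipped to the wrong label by modifying just a few pixels, mimicking the localized, sticker-style changes the researchers used. All of the weights and pixel values here are made up for the demonstration.

```python
def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def classify(weights, image):
    # Return the label whose linear score over the pixels is highest.
    scores = {label: dot(w, image) for label, w in weights.items()}
    return max(scores, key=scores.get)

# Hypothetical learned weights for two sign classes (16 "pixels" each).
weights = {
    "stop":     [0.6, -0.9, 0.4, 0.1] * 4,
    "speed_45": [-0.2, 0.9, -0.1, 0.3] * 4,
}

clean = [1.0, 0.0, 1.0, 0.0] * 4       # the unmodified sign
assert classify(weights, clean) == "stop"

# "Stickers": change only 4 of the 16 pixels, leaving the rest intact.
stickered = clean[:]
for i in (1, 5, 9, 13):
    stickered[i] += 1.5                # a localized perturbation

print(classify(weights, stickered))    # → speed_45
```

Because the classifier's decision is a weighted sum, a perturbation concentrated on the pixels with the most influential weights can flip the winning label even though most of the image is untouched, which is roughly why a few well-placed stickers suffice.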


[Image: Evtimov et al., arXiv.org]

In a test of a right-turn sign, a mask that filled in the arrow on the sign resulted in a 100 percent misclassification rate. In two-thirds of the trials, the right-turn sign was misclassified as a stop sign, and in one-third, it was misclassified as an added-lane sign. Graffiti-like stickers reading "love" and "hate" confused the classifier into reading a stop sign as a speed limit sign the majority of the time, as did an abstract design in which just a few block-shaped stickers were placed over the sign.

“We hypothesize that given the similar appearance of warning signs, small perturbations are sufficient to confuse the classifier,” they write.

The study suggests that hackers wouldn't need much equipment to wreak havoc on a driverless car. If they knew the algorithm behind the car's vision system, they would need only a printer or some stickers to fool it.

However, the attacks could be foiled if cars include fail-safes, such as multiple sensors, and take context (like whether the car is driving in a city or on a highway) into account when reading signs, as Autoblog notes.
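That defense can be sketched as a simple cross-check. The code below is a hypothetical illustration of the idea, not any carmaker's actual system: the camera's sign reading is accepted only when it both agrees with a second source (here, map data) and makes sense for the driving context; otherwise the car falls back to the map. The rule sets and function names are assumptions made up for this sketch.

```python
def plausible(sign, context):
    # Hypothetical plausibility rules: a 45 mph speed-limit sign is
    # suspicious on a residential street; a stop sign is suspicious
    # on a highway.
    rules = {
        "residential": {"stop", "yield", "speed_25"},
        "highway": {"speed_45", "speed_65", "exit"},
    }
    return sign in rules.get(context, set())

def fused_reading(camera_sign, map_sign, context):
    # Trust the camera only when the map agrees and the sign fits the
    # context; otherwise fall back to the map's record.
    if camera_sign == map_sign and plausible(camera_sign, context):
        return camera_sign
    return map_sign

# A stickered stop sign misread as "speed_45" on a residential street
# is overruled by the map's record of a stop sign at that location.
print(fused_reading("speed_45", "stop", "residential"))  # → stop
```

The design choice here is that no single sensor gets the final word: a spoofed camera reading only succeeds if the attacker can also corrupt the map data and pick a context where the fake sign is plausible, which raises the bar considerably over a sticker and a printer.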

[h/t Autoblog]