BGU developing system to stop self-driving cars being fooled

The researchers explained that advanced driving-assistance systems treat depth-less projections of objects as real objects.

(photo credit: SHUTTERSTOCK)
Researchers at Ben-Gurion University of the Negev’s Cyber Security Research Center have found that a phantom image projected on the road in front of a semi-autonomous vehicle can cause its autopilot to brake suddenly, endangering the lives of drivers and passengers.
“Autopilots and advanced driving-assistance systems (ADASs) in semi-autonomous or fully autonomous cars consider depth-less projections of objects (phantoms) real objects,” the researchers said in a statement released Tuesday by BGU.
In a bid to detect phantoms, the researchers are developing a “neural network that examines and detects an object’s reflected light, surface and context and decides whether the detected object is a real object or phantom.”
“This is done purely based on the video camera’s output,” Ben Nassi, a PhD student at the Cyber Security Research Center, told The Jerusalem Post.
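The team has not published its architecture, but a detector along the lines Nassi describes, with separate looks at an object’s reflected light, surface and context feeding a single real-or-phantom score, might be sketched as follows. Every name, layer and size below is an illustrative assumption, not the BGU model:

```python
# Hypothetical sketch of a camera-only phantom detector. The branch names
# (light, surface, context) follow the article's description; the layers,
# sizes and training details are illustrative assumptions.
import torch
import torch.nn as nn

class PhantomDetector(nn.Module):
    def __init__(self):
        super().__init__()
        # One small convolutional branch per visual cue the article mentions.
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
        self.light = branch()
        self.surface = branch()
        self.context = branch()
        # Combine the three cues and emit one real-vs-phantom probability.
        self.head = nn.Sequential(nn.Linear(32 * 3, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, crop, context_crop):
        # `crop` is the detected object; `context_crop` is a wider view around it.
        features = torch.cat(
            [self.light(crop), self.surface(crop), self.context(context_crop)], dim=1
        )
        return torch.sigmoid(self.head(features))  # probability the object is real

# Usage: score a 64x64 object crop against a 128x128 context crop.
model = PhantomDetector()
crop = torch.rand(1, 3, 64, 64)
context = torch.rand(1, 3, 128, 128)
print(model(crop, context))  # ~0.5 before any training
```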
Nassi is a student of Prof. Yuval Elovici, director of the BGU Cyber Security Research Center and of Deutsche Telekom Innovation Labs@BGU, whose team has been “able to train a model that accurately detects phantoms.”
During their research, which they dubbed “Phantom of the ADAS,” the team demonstrated how attackers can exploit this perceptual challenge to manipulate a car into endangering its passengers.
The group also demonstrated that attackers “can fool a driving assistance system into believing fake road signs are real by disguising phantoms for 125 milliseconds in advertisements presented on digital billboards located near roads.”
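For context, a back-of-the-envelope calculation (ours, not the researchers’) shows why 125 milliseconds is brief enough for a human viewer to miss yet still spans several frames of an ADAS camera’s footage:

```python
# Rough arithmetic, not from the study: how many video frames a 125 ms
# phantom occupies at common camera frame rates.
PHANTOM_MS = 125

for fps in (24, 30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: {frame_ms:.1f} ms/frame -> "
          f"{PHANTOM_MS} ms spans ~{PHANTOM_MS / frame_ms:.1f} frames")
```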
While semi/fully autonomous cars are already being deployed in countries around the world, “the deployment of vehicular communication systems is delayed.
“Vehicular communication systems connect the car with other cars, pedestrians and surrounding infrastructure,” BGU said in a statement. “The lack of such systems creates a ‘validation gap,’ which prevents semi/fully autonomous vehicles from validating their virtual perception with a third party, requiring them to rely solely on their sensors.”
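In code terms, the validation gap might be pictured like this. The `roadside_oracle` service is entirely hypothetical, standing in for the vehicular communication systems that have not yet been deployed:

```python
# Illustrative sketch of the "validation gap". Everything here is
# hypothetical: no deployed V2X service exists for the car to query,
# which is exactly the gap the researchers describe.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str        # e.g. "pedestrian", "speed_limit_sign"
    position: tuple  # (x, y) road coordinates
    confidence: float

def roadside_oracle(detection: Detection) -> bool:
    """Stand-in for a vehicular communication system that could confirm
    an object with surrounding infrastructure. Not deployed today."""
    raise ConnectionError("no vehicular communication infrastructure available")

def validate(detection: Detection) -> bool:
    try:
        # With deployed V2X, the car could cross-check its perception.
        return roadside_oracle(detection)
    except ConnectionError:
        # The validation gap: with no third party to ask, the vehicle must
        # trust its own sensors, so a convincing phantom is accepted as real.
        return detection.confidence > 0.5

print(validate(Detection("pedestrian", (12.0, 3.5), confidence=0.9)))  # True
```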

“This is an ongoing research project” that has taken the team almost a year, Nassi said.
“We were inspired by the fact that cars cannot validate their virtual perception with current infrastructure due to the absence of deployed vehicular communication protocols,” he said. “This is not a bug. This is not the result of poor code implementation. This is a fundamental flaw in object detectors that essentially use feature matching for detecting visual objects and were not trained to distinguish between real and fake objects.
“This type of attack is currently not taken into consideration by the automobile industry.”
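Nassi’s point about feature matching can be illustrated with a toy example (ours, not the researchers’ code): because a projector reproduces an object’s appearance almost pixel for pixel, a matcher that compares appearance alone scores the phantom as highly as the real thing:

```python
# Toy illustration of the flaw Nassi describes: appearance features of a
# projected phantom are nearly identical to those of the real object.
import numpy as np

rng = np.random.default_rng(0)

def features(image):
    """Stand-in feature extractor: a centered, normalized pixel vector.
    Real detectors use learned features, but the failure mode is the same."""
    v = image.astype(float).ravel()
    v -= v.mean()
    return v / np.linalg.norm(v)

real_object = rng.random((32, 32))                     # the "real" appearance
phantom = real_object + rng.normal(0, 0.01, (32, 32))  # projection: same look
unrelated = rng.random((32, 32))                       # random road patch

template = features(real_object)
print("phantom vs real:  ", round(float(features(phantom) @ template), 3))    # ~1.0
print("unrelated vs real:", round(float(features(unrelated) @ template), 3))  # ~0.0
```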
Nassi said the attackers would be people with an interest in endangering a car by causing an accident using a phantom projection.
Phantom attacks “can be applied remotely using a drone equipped with a portable projector or by hacking digital billboards that face the Internet and are located close to roads, thereby eliminating the need to physically approach the attack scene, changing the exposure versus application balance,” he said.
Nassi said such attacks leave no evidence at the scene, require no complex preparation and can be carried out with cheap equipment.
While previous attacks exploiting this validation gap required skilled attackers to approach the scene, he demonstrated that remote attacks can fool advanced systems using easily accessible equipment such as a drone and a projector.
In practice, depth-less objects projected on the road are treated as real even though the vehicles are equipped with depth sensors.
The researchers, the university said, “believe that this is the result of a ‘better safe than sorry’ policy that causes the car to consider a visual 2-D object real.”
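A minimal sketch of such a policy, assuming hypothetical confidence thresholds rather than any vendor’s actual logic, shows why a flat but convincing projection still triggers braking:

```python
# Sketch of the "better safe than sorry" fusion policy the researchers
# describe. The thresholds are assumptions for illustration only.
def should_brake(camera_confidence: float, depth_confirms: bool) -> bool:
    if camera_confidence > 0.8:
        # Visual evidence alone is treated as sufficient; requiring depth
        # confirmation would risk ignoring a real obstacle the lidar missed.
        return True
    return camera_confidence > 0.5 and depth_confirms

# A phantom: crisp 2-D projection, so high camera confidence, no depth return.
print(should_brake(camera_confidence=0.95, depth_confirms=False))  # True -> brakes
```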
Addressing what automobile companies using automated technology should do to correct this flaw, Nassi suggested a countermeasure “that is able to determine whether a detected object is real or not.”