“Ghost objects” can disrupt the driving of self-driving cars


Israeli researchers have shown that self-driving cars can be fooled by “ghost objects”: images that go unnoticed by the naked eye but are registered by the vehicle's perception system, with potentially dangerous effects on the driving of autonomous cars.


In 2019, the Japanese manufacturer Toyota pointed out that no carmaker is yet able to market a Level 5 autonomous car, the highest level of autonomy, synonymous with fully autonomous driving. For now, Tesla vehicles are among the most advanced, sitting between Levels 2 and 3. Autonomous vehicles are therefore still very far from completing a route without any intervention from the driver. However, these same vehicles could soon make progress in certain areas.

Disrupted systems

According to a study by Ben-Gurion University of the Negev (Israel), certain elements can interfere with the driving of autonomous vehicles: “ghost objects” that disrupt the artificial-intelligence systems. These ghost objects are invisible to the driver but are taken into account by the system, so the vehicle may decide to brake or swerve abruptly. They can be, for example, digital billboards that flash images for a split second, long enough to influence how the vehicle reads road signage.
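
To see why an image flashed for a fraction of a second is enough, consider a minimal sketch of a perception loop that reacts to every individual camera frame. It is purely illustrative (no real vehicle runs this code); the simulated feed and label names are assumptions.

```python
# Minimal sketch of a naive per-frame perception loop (hypothetical:
# no real vehicle runs this code). Each "frame" is the set of labels
# an object detector would report for one camera image.

# Simulated feed at, say, 30 fps; a phantom stop sign is flashed in
# a single frame (frame 2).
frames = [set(), set(), {"stop_sign"}, set(), set()]

def drive_loop(frames):
    for t, detections in enumerate(frames):
        if "stop_sign" in detections:
            # The loop trusts each frame in isolation, so one flashed
            # frame is enough to trigger an emergency maneuver.
            print(f"frame {t}: stop sign detected -> emergency braking")
        else:
            print(f"frame {t}: road clear -> continue")

drive_loop(frames)
```

Because each frame is trusted in isolation, a single flashed image out of thousands of frames is enough to trigger an emergency maneuver.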

A solution remains to be found

Yisroel Mirsky, a security researcher who took part in the study, explains that these spoofed signs can cause sudden braking and swerving: the car reacts in a fraction of a second, without the driver understanding why. The study's authors cite tests on Tesla vehicles running the latest version of Autopilot. In one of the tests, a spoofed stop sign visible to the vehicle for only 0.42 seconds was enough to mislead it. An eighth of a second was enough to fool Mobileye, another driver-assistance and collision-avoidance system.

In the near future, the study's authors will publish further results. Their goal is to find a countermeasure, because this kind of flaw could be exploited: a malicious individual could attack several vehicles at the same time.
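
One direction sometimes discussed for this kind of attack, sketched below purely as an illustration (the article does not describe the researchers' actual countermeasure), is temporal consistency: only act on an object that persists across several consecutive frames, so that briefly flashed phantoms are filtered out. The 5-frame threshold and label names are assumptions.

```python
# Illustrative sketch of one possible mitigation idea (not necessarily
# the researchers' countermeasure): require a detection to persist for
# several consecutive frames before acting on it, so that briefly
# flashed phantoms are ignored. The 5-frame threshold is an assumption.

from collections import defaultdict

PERSISTENCE = 5  # consecutive frames an object must survive to be trusted

def filtered_loop(frames, persistence=PERSISTENCE):
    streak = defaultdict(int)  # consecutive sightings per label
    for t, detections in enumerate(frames):
        for label in list(streak):
            if label not in detections:
                streak[label] = 0          # streak broken: reset
        for label in detections:
            streak[label] += 1
            if streak[label] == persistence:
                print(f"frame {t}: {label} confirmed -> act on it")

phantom = [set(), {"stop_sign"}, {"stop_sign"}, {"stop_sign"}, set()]
real = [{"stop_sign"}] * 8

filtered_loop(phantom)  # prints nothing: the 3-frame phantom never qualifies
filtered_loop(real)     # confirms the sign at frame 4 (its 5th sighting)
```

The trade-off is reaction latency: a threshold long enough to filter a 0.42-second phantom would also delay the response to a real sign, which is part of why the problem is hard to solve.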

In any case, autonomous vehicles still have a lot of progress to make. In 2018, a self-driving Uber car struck and killed a woman pushing a bicycle across the road in Arizona (United States). A year later, the investigation determined that the software had failed to recognize her as a pedestrian and did not account for the possibility of people crossing outside a crosswalk.
