Researchers have successfully deceived autonomous vehicle LiDAR systems using nothing more than mirrors attached to traffic cones, raising serious concerns about self-driving car safety.
How the Attack Works
The researchers attached mirrors to traffic cones so that the LiDAR's laser pulses were redirected rather than reflected straight back, causing the sensor to report returns at the wrong distances and directions. The results were alarming:
- Vehicles ignored real obstacles in their path
- Cars detected phantom barriers that didn’t exist
- Systems triggered unnecessary emergency stops
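The effects above follow from simple geometry. A LiDAR assumes each pulse travels in a straight line and places a point at the measured range along the emitted ray; a mirror breaks that assumption. The sketch below is a minimal one-bounce model for illustration only; the function and the distances are assumptions, not details from the study:

```python
# Toy model of mirror-based LiDAR spoofing (illustrative, not the
# researchers' actual setup). With a mirror in the beam path, the
# measured range is sensor->mirror plus mirror->reflected-target,
# but the sensor plots the point at that range along the ORIGINAL ray.

def perceived_range(d_mirror: float, d_after_bounce: float) -> float:
    """Range at which the sensor places a return along the original ray."""
    return d_mirror + d_after_bounce

# Phantom obstacle: a mirror 5 m ahead bounces the pulse onto a surface
# 15 m away -> the sensor "sees" an object 20 m straight ahead.
phantom = perceived_range(5.0, 15.0)
print(f"phantom object plotted at {phantom} m")  # 20.0 m

# Hidden obstacle: a real car sits 8 m ahead, but the mirror deflects the
# pulse past it toward open road; the return comes back far too late, so
# nothing is plotted anywhere near the car's true 8 m range.
real_obstacle = 8.0
measured = perceived_range(5.0, 60.0)
print(f"return plotted at {measured} m, real obstacle at {real_obstacle} m")
```

Both failure modes in the list above fall out of this single mechanism: short bounce paths create phantom objects, long ones erase real ones.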
Success Rates
The mirror-based deception proved surprisingly effective:
- 65% success rate with two mirrors
- 74% success rate with six mirrors
Real-World Implications
The security flaw creates dangerous scenarios:
- Emergency braking on busy roads could cause rear-end collisions
- False object detection could trigger unnecessary evasive maneuvers
- In one test, the system identified a fake object 20 meters away, causing the vehicle to swerve
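The 20-meter figure matters because the reaction window shrinks quickly with speed. A rough back-of-envelope calculation, with the vehicle speeds assumed for illustration (they are not from the study):

```python
# Time available before reaching a phantom object detected 20 m ahead.
# Speeds below are assumed examples, not values from the research.

DETECTION_RANGE_M = 20.0

for speed_kmh in (30, 50, 100):
    speed_ms = speed_kmh / 3.6                    # km/h -> m/s
    time_to_object = DETECTION_RANGE_M / speed_ms  # seconds until impact point
    print(f"{speed_kmh:>3} km/h -> {time_to_object:.2f} s to respond")
```

Even at a moderate 50 km/h, the planner has well under two seconds to brake or swerve, which helps explain why a phantom at that range provokes an abrupt maneuver.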
Safety Concerns
This simple attack method highlights a critical vulnerability in current autonomous driving technology. The ability to fool LiDAR sensors with basic equipment raises questions about the readiness of self-driving cars for widespread deployment.
The research demonstrates that even low-tech methods can compromise sophisticated automotive safety systems, potentially putting passengers and other road users at risk.