A novel and dangerous form of attack against autonomous vehicles has been identified, known as LAR blinding. This technique is a physical-layer attack that uses coordinated lasers to interfere with the optical sensors that self-driving cars rely on for navigation and obstacle avoidance. By projecting specific laser patterns, attackers can jam the sensors (e.g., LiDAR, cameras) or, more insidiously, inject false data to create "ghost" objects or erase real ones. This effectively blinds the vehicle to its true surroundings, creating an immediate and high-risk scenario for collisions, endangering passengers, pedestrians, and other vehicles.
LAR blinding is a type of sensor-level, cyber-physical attack. It does not require hacking the car's software but instead manipulates the physical inputs to that software. The attack works by overwhelming the car's optical sensors with laser light.
The term "coordinated" suggests that multiple lasers might be used in concert to create a more comprehensive and confusing false reality for the vehicle's perception system.
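One simple symptom of this kind of jamming is sensor saturation: a flooded camera reports an abnormally large fraction of maxed-out pixels. The sketch below is a minimal, hypothetical illustration of that idea; the 30% threshold and the 8-bit pixel model are illustrative assumptions, not values from any real perception stack.

```python
# Hypothetical sketch: flag a camera frame as possibly laser-blinded when an
# unusually large fraction of its pixels is saturated. Thresholds are
# illustrative assumptions, not production values.

SATURATED = 255          # max value for an 8-bit grayscale pixel
BLINDING_RATIO = 0.30    # assumed alarm threshold: 30% saturated pixels

def frame_looks_blinded(pixels: list[int]) -> bool:
    """Return True when the saturated-pixel ratio exceeds the threshold."""
    if not pixels:
        return False
    saturated = sum(1 for p in pixels if p >= SATURATED)
    return saturated / len(pixels) > BLINDING_RATIO

normal_frame = [40, 90, 120, 255, 60, 80, 100, 70]       # one bright pixel
blinded_frame = [255, 255, 250, 255, 255, 90, 255, 255]  # laser flooding

print(frame_looks_blinded(normal_frame))   # False
print(frame_looks_blinded(blinded_frame))  # True
```

A real system would apply an analogous check to LiDAR (e.g., an abnormal rate of dropped or maxed-intensity returns) rather than raw pixels.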
The primary targets of LAR blinding are the optical sensors essential for autonomous driving:

- LiDAR, which emits laser pulses to build a 3D map of the vehicle's surroundings
- Cameras, which provide the visual input for object recognition and lane keeping
This attack vector is particularly challenging because it bypasses traditional cybersecurity defenses focused on software and networks. It attacks the very perception of reality that the autonomous system is built upon.
While the MITRE ATT&CK for ICS framework was designed for Industrial Control Systems, its concepts map well to this cyber-physical threat:
- T0882 - Manipulation of I/O: The core of the attack is manipulating the input (light) to the sensor (an I/O device) to cause an incorrect digital representation of the physical world.
- T0884 - Inhibit Response Function: By blinding the sensors, the attack inhibits the car's primary response function: obstacle avoidance.
- T0826 - Loss of View: The jamming aspect of the attack directly causes a loss of view for the vehicle's perception system.

The impact of a successful LAR blinding attack is direct, physical, and potentially lethal.
Detecting and responding to physical-layer attacks requires a multi-modal approach.
- D3-RPA - Relayed Protocol Analysis: the concept, but applied to physical sensors rather than network traffic.

Mitigation involves making the sensors themselves more resilient and the overall system less brittle.
Using multiple, diverse sensor types (LiDAR, radar, cameras) provides redundancy. An attack on one type of sensor can be detected and mitigated by relying on the others.
The system should constantly analyze the data from all its sensors to ensure it forms a coherent and physically plausible model of the world. Discrepancies indicate a fault or attack.
While D3FEND's Relayed Protocol Analysis is typically for network protocols, the concept can be adapted for sensor fusion to counter 'LAR Blinding'. The autonomous vehicle's central processing unit must act as an analysis engine for the 'protocols' of its physical sensors. It should continuously compare the world models generated by its disparate sensors (LiDAR, radar, cameras, ultrasonics). If the LiDAR reports a clear path while the radar reports a large metal object (a car), this is a critical discrepancy. The system should be programmed to identify this conflict as a potential sensor compromise. By analyzing and cross-validating these relayed 'messages' from the physical world, the system can detect the attack and enter a safe state, rather than trusting the manipulated data from the blinded optical sensors.
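The cross-validation logic above can be sketched as a simple voting check. This is a hypothetical toy model, not a real fusion engine: each sensor reports whether it sees an obstacle in the same region, and any disagreement is treated as a possible sensor compromise that forces a conservative safe state.

```python
# Hypothetical sketch of cross-sensor validation: each sensor votes on
# "obstacle ahead?" for the same region; disagreement between sensors is
# treated as a potential blinding attack. Names and logic are illustrative.

def assess(readings: dict[str, bool]) -> str:
    """Fuse per-sensor obstacle reports into a driving decision."""
    votes = set(readings.values())
    if len(votes) == 1:                      # all sensors agree
        return "BRAKE" if votes.pop() else "PROCEED"
    # Disagreement: optical sensors may be blinded; distrust the
    # conflicting data and fall back to a conservative safe state.
    return "SAFE_STATE"

# LiDAR reports a clear path while radar sees a large metal object:
# the critical discrepancy described above.
print(assess({"lidar": False, "camera": False, "radar": True}))  # SAFE_STATE
print(assess({"lidar": True, "camera": True, "radar": True}))    # BRAKE
```

A production system would of course weigh sensor confidence, track history over time, and reason about which sensor is more trustworthy for a given object class, rather than using a flat unanimity rule.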
To mitigate LAR Blinding, Platform Hardening must be applied to the physical sensor suite. This involves several technical approaches. First, equipping cameras and LiDAR sensors with narrow band-pass filters that only pass light at their specific operating wavelength (e.g., 905 nm or 1550 nm for LiDAR). This can filter out attacks from common, off-the-shelf lasers operating at different wavelengths. Second, implementing randomized scanning patterns and pulse timing for LiDAR systems, which makes it significantly harder for an attacker to synchronize their malicious pulses with the sensor's receiver. Third, using polarization filters: laser light is often polarized, which can help distinguish it from natural, unpolarized ambient light. These hardware-level hardening techniques make the sensor platform itself more resilient to physical-layer manipulation.
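The randomized-timing idea can be illustrated with a toy simulation. The premise, under stated assumptions: the emitter adds a random per-pulse timing dither that an attacker cannot observe, so a genuine echo's apparent range stays consistent pulse-to-pulse while a spoofer firing on a fixed schedule produces ranges that wander and can be rejected. All constants and function names here are illustrative, not drawn from any real LiDAR design.

```python
# Hypothetical sketch of randomized pulse-timing hardening for LiDAR:
# genuine echoes track the dithered emission times, so their apparent
# delays are stable; a spoofer that cannot see the dither produces
# inconsistent delays. All parameters are illustrative assumptions.
import random

C = 3.0e8  # speed of light, m/s

def simulate(n_pulses: int, true_range_m: float, spoof_range_m: float,
             seed: int = 7):
    """Return (genuine, spoofed) apparent echo delays over n_pulses."""
    rng = random.Random(seed)
    dithers = [rng.uniform(0, 500e-9) for _ in range(n_pulses)]  # per-pulse jitter
    # Genuine echo: delay is measured from each dithered emission, so the
    # apparent delay (hence range) is the same for every pulse.
    genuine = [2 * true_range_m / C for _ in dithers]
    # Spoofer: replays at a fixed offset from the *nominal* (undithered)
    # schedule, so its delay relative to the dithered emission wanders.
    spoofed = [2 * spoof_range_m / C - d for d in dithers]
    return genuine, spoofed

def delays_consistent(delays: list[float], tol: float = 1e-9) -> bool:
    """Accept a track only if its pulse-to-pulse delay spread is tiny."""
    return max(delays) - min(delays) <= tol

genuine, spoofed = simulate(8, true_range_m=30.0, spoof_range_m=30.0)
print(delays_consistent(genuine))  # True: real returns agree pulse-to-pulse
print(delays_consistent(spoofed))  # False: spoofed returns jitter -> rejected
```

The design point is the asymmetry: the receiver knows its own random schedule, the attacker does not, so consistency with that schedule becomes an authentication check on the physical channel.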
