"LAR Blinding Attacks" Use Lasers to Deceive Autonomous Car Sensors, Creating Critical Collision Risks

"LAR Blinding Attacks" on Autonomous Cars Use Lasers to Trick Sensors

CRITICAL
March 22, 2026
Cyberattack · IoT Security


Full Report

Executive Summary

A novel and dangerous form of attack against autonomous vehicles has been identified, known as LAR blinding. This technique is a physical-layer attack that uses coordinated lasers to interfere with the optical sensors that self-driving cars rely on for navigation and obstacle avoidance. By projecting specific laser patterns, attackers can jam the sensors (e.g., LiDAR, cameras) or, more insidiously, inject false data to create "ghost" objects or erase real ones. This effectively blinds the vehicle to its true surroundings, creating an immediate and high-risk scenario for collisions, endangering passengers, pedestrians, and other vehicles.

Threat Overview

LAR blinding is a type of sensor-level, cyber-physical attack. It does not require hacking the car's software but instead manipulates the physical inputs to that software. The attack works by overwhelming the car's optical sensors with laser light.

  • Jamming: A powerful laser can saturate the sensor's detector, effectively blinding it and creating a large blind spot in the car's field of view.
  • Spoofing/Manipulation: A more sophisticated attack involves modulating the laser to mimic the signals the sensor expects to receive. For a LiDAR sensor, this could mean sending back fake light pulses to create the illusion of an obstacle where there is none, or to interfere with the real reflections from an actual obstacle, making it invisible to the system.

The term "coordinated" suggests that multiple lasers might be used in concert to create a more comprehensive and confusing false reality for the vehicle's perception system.
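
The spoofing variant reduces to simple time-of-flight arithmetic: a LiDAR converts a pulse's round-trip time into distance, so any pulse that arrives at the receiver with the right timing is indistinguishable from a real reflection. A minimal Python sketch (the numbers are illustrative, not taken from any specific sensor):

```python
# Illustrative sketch: how time-of-flight ranging can be fooled by a
# well-timed attacker pulse. All figures are hypothetical.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance a LiDAR infers from a pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A genuine reflection from an obstacle 40 m away returns after ~267 ns:
real_m = tof_distance(2 * 40.0 / C)

# An attacker's pulse timed to arrive ~66.7 ns after the LiDAR fires is
# indistinguishable from a reflection off a "ghost" object ~10 m ahead:
ghost_m = tof_distance(2 * 10.0 / C)
```

Because the receiver has no way to verify that a photon is its own pulse coming back, timing alone determines where the "object" appears.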

Technical Analysis

The primary targets of LAR blinding are the optical sensors essential for autonomous driving:

  • LiDAR (Light Detection and Ranging): LiDAR works by sending out pulses of laser light and measuring the time it takes for them to reflect off objects. By firing a synchronized laser at the LiDAR's receiver, an attacker can create fake return signals, causing the system to perceive objects that are not there.
  • Cameras: Standard optical cameras can be temporarily or permanently damaged by high-powered lasers. They can also be tricked by projecting specific images or patterns that exploit the machine learning models used for object recognition.

This attack vector is particularly challenging because it bypasses traditional cybersecurity defenses focused on software and networks. It attacks the very perception of reality that the autonomous system is built upon.

MITRE ATT&CK for ICS Mapping

While the framework was designed for Industrial Control Systems, several of its concepts map well to this cyber-physical threat:

  • T0832 - Manipulation of View: The core of the attack is manipulating the input (light) to the sensor so that the system builds an incorrect digital representation of the physical world.
  • Inhibit Response Function (tactic TA0107): By blinding the sensors, the attack inhibits the car's primary response function: obstacle avoidance.
  • T0829 - Loss of View: The jamming aspect of the attack directly causes a loss of view for the vehicle's perception system.

Impact Assessment

The impact of a successful LAR blinding attack is direct, physical, and potentially lethal.

  • High Risk of Collision: An autonomous vehicle that cannot see a pedestrian, a stopped car, or a wall is almost certain to collide with it.
  • Loss of Life and Property: Such collisions can easily result in the death or serious injury of passengers and others, as well as significant property damage.
  • Erosion of Public Trust: High-profile incidents of this nature could completely derail public acceptance and regulatory approval of autonomous vehicle technology for years, regardless of how statistically safe they are in normal conditions.

Detection & Response

Detecting and responding to physical-layer attacks requires a multi-modal approach.

Detection Strategies

  1. Sensor Fusion Anomaly Detection: The key to defense is sensor fusion. An autonomous vehicle uses multiple sensor types (LiDAR, cameras, radar, ultrasonic). If the LiDAR and camera are blinded by a laser but the radar (which is immune to lasers) still detects an object, the system can flag the discrepancy. This adapts D3FEND's D3-RPA concept from network protocols to physical sensors.
  2. Light Source Detection: Specialized filters or secondary sensors could be designed to detect the specific, coherent, high-intensity light from a laser and distinguish it from ambient light or reflections.
  3. Unusual Sensor Data Patterns: A jammed sensor produces a very specific type of noise or data pattern (e.g., a solid block of max-range readings on a LiDAR scan). These patterns can be identified by the perception software as indicative of an attack rather than a sensor failure.
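
The discrepancy and jamming-pattern checks above can be sketched as two simple heuristics. The threshold, maximum range, and function names here are hypothetical, not drawn from any real AV perception stack:

```python
# Hypothetical detection heuristics for a LiDAR scan and a radar track.
MAX_RANGE_M = 200.0  # assumed LiDAR maximum range

def lidar_looks_jammed(ranges, saturation_frac=0.8):
    """Flag a scan where most returns sit at max range: a pattern more
    consistent with detector saturation than with ordinary open road."""
    saturated = sum(1 for r in ranges if r >= MAX_RANGE_M)
    return saturated / len(ranges) >= saturation_frac

def cross_modal_conflict(lidar_sees_object, radar_sees_object):
    """Radar is immune to lasers: if it reports an object the optical
    sensors miss (or vice versa), treat the scan as suspect."""
    return lidar_sees_object != radar_sees_object
```

In practice such heuristics would feed a confidence score rather than a binary verdict, but the core idea is the same: a blinded sensor fails in a statistically recognizable way.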

Response Actions

  • Safe State: Upon detecting a credible sensor attack, the vehicle must immediately enter a minimal risk condition. This could involve slowing down, pulling over to the side of the road, and alerting the human driver (if present) to take control.
  • Rely on Unaffected Sensors: The system should be programmed to trust the data from unaffected sensors (like radar) more heavily when an attack on optical sensors is detected.
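
The two response actions can be sketched as a small mode selector. The states and transition logic below are illustrative assumptions, not any vendor's actual fail-safe design:

```python
from enum import Enum, auto

class DriveState(Enum):
    NOMINAL = auto()       # trust the full sensor suite
    DEGRADED = auto()      # down-weight optical sensors, lean on radar
    MINIMAL_RISK = auto()  # slow down, pull over, alert the driver

def next_state(optical_attack_suspected: bool, radar_healthy: bool) -> DriveState:
    """Pick the operating mode after a sensor-integrity check."""
    if not optical_attack_suspected:
        return DriveState.NOMINAL
    # Optical sensors are suspect: continue only if radar can carry the load.
    if radar_healthy:
        return DriveState.DEGRADED
    return DriveState.MINIMAL_RISK
```

The key design choice is that a suspected attack never leaves the vehicle in NOMINAL: the system either degrades gracefully onto unaffected sensors or stops.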

Mitigation

Mitigation involves making the sensors themselves more resilient and the overall system less brittle.

Strategic Mitigation

  1. Sensor Redundancy and Diversity: As mentioned, using a diverse set of sensors (optical, radio-frequency, acoustic) is the most important mitigation. An attacker is unlikely to be able to jam all of them simultaneously with a single tool.
  2. Advanced Filtering: Use optical filters on cameras and LiDARs that can block light outside of the specific wavelengths they operate on, which can help reduce the impact of some lasers.
  3. Randomized Sensing: Introduce randomization into LiDAR scanning patterns or camera frame rates. This makes it much harder for an attacker to synchronize their laser with the sensor's operation to inject spoofed data.
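
The randomized-sensing idea in point 3 can be illustrated with jittered pulse scheduling: if the inter-pulse interval varies unpredictably, an attacker cannot pre-compute when the receiver expects a return. The interval and jitter values below are arbitrary assumptions:

```python
import random

def jittered_pulse_times(n, base_interval_s=1e-6, jitter_s=2e-7, seed=None):
    """Schedule n LiDAR pulses with random jitter around a base interval,
    so spoofed returns timed against a fixed schedule mostly miss their
    listening windows (parameters are illustrative)."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        t += base_interval_s + rng.uniform(-jitter_s, jitter_s)
        times.append(t)
    return times
```

An attacker observing past pulses gains little, because each interval is drawn fresh; only a spoofer fast enough to react within a single round trip could keep up.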

Tactical Mitigation

  • Software Updates: Vehicle manufacturers must be able to rapidly deploy software updates to improve their sensor fusion algorithms and detection logic as new attack techniques are discovered.

Timeline of Events

March 22, 2026: This article was published.

MITRE ATT&CK Mitigations

  • Sensor redundancy and diversity: Using multiple, diverse sensor types (LiDAR, radar, cameras) provides redundancy. An attack on one type of sensor can be detected and mitigated by relying on the others.
  • Cross-sensor plausibility checks: The system should constantly analyze the data from all its sensors to ensure it forms a coherent and physically plausible model of the world. Discrepancies indicate a fault or attack.

D3FEND Defensive Countermeasures

While D3FEND's Relayed Protocol Analysis is typically for network protocols, the concept can be adapted for sensor fusion to counter 'LAR Blinding'. The autonomous vehicle's central processing unit must act as an analysis engine for the 'protocols' of its physical sensors. It should continuously compare the world models generated by its disparate sensors (LiDAR, radar, cameras, ultrasonics). If the LiDAR reports a clear path while the radar reports a large metal object (a car), this is a critical discrepancy. The system should be programmed to identify this conflict as a potential sensor compromise. By analyzing and cross-validating these relayed 'messages' from the physical world, the system can detect the attack and enter a safe state, rather than trusting the manipulated data from the blinded optical sensors.

To mitigate LAR Blinding, Platform Hardening must be applied to the physical sensor suite. Several technical approaches apply. First, equip cameras and LiDAR sensors with narrow band-pass filters that only pass light at their specific operating wavelength (e.g., 905 nm or 1550 nm for LiDAR); this filters out attacks from common, off-the-shelf lasers operating at different wavelengths. Second, implement randomized scanning patterns for LiDAR systems, which makes it significantly harder for an attacker to synchronize their malicious pulses with the sensor's receiver. Third, use polarization filters: laser light is often polarized, which can help distinguish it from natural, unpolarized ambient light. These hardware-level hardening techniques make the sensor platform itself more resilient to physical-layer manipulation.
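
The band-pass filtering described above can be modeled in a few lines. The 905 nm center and the ±10 nm width are assumptions for illustration, not the specification of any real filter:

```python
def reaches_detector(wavelength_nm, center_nm=905.0, half_width_nm=10.0):
    """Narrow band-pass filter model: only light near the sensor's
    operating wavelength passes through to the detector."""
    return abs(wavelength_nm - center_nm) <= half_width_nm

# A 532 nm green laser is rejected outright; to get through, an attacker
# would need a source tuned to the sensor's own operating wavelength.
```

This also shows the limit of the defense: it raises the bar from "any laser" to "a laser at the sensor's wavelength", which is why it must be layered with randomized sensing and sensor fusion.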

Sources & References

Article Author

Jason Gomes

• Cybersecurity Practitioner

Cybersecurity professional with over 10 years of specialized experience in security operations, threat intelligence, incident response, and security automation. Expertise spans SOAR/XSOAR orchestration, threat intelligence platforms, SIEM/UEBA analytics, and building cyber fusion centers. Background includes technical enablement, solution architecture for enterprise and government clients, and implementing security automation workflows across IR, TIP, and SOC use cases.

Threat Intelligence & Analysis · Security Orchestration (SOAR/XSOAR) · Incident Response & Digital Forensics · Security Operations Center (SOC) · SIEM & Security Analytics · Cyber Fusion & Threat Sharing · Security Automation & Integration · Managed Detection & Response (MDR)

Tags

Autonomous Vehicle · Cyber-Physical Attack · Sensor Attack · LiDAR · Automotive Security · Hacking
