Abstract: LiDARs play a critical role in the perception and safe operation of
Autonomous Vehicles (AVs). Recent work has demonstrated that it is possible to
spoof LiDAR return signals to elicit fake objects. In this work, we demonstrate
how the same physical capabilities can be used to mount a new, even more
dangerous class of attacks, namely Object Removal Attacks (ORAs). ORAs aim to
force 3D object detectors to fail. We leverage the default setting of LiDARs
that record a single return signal per direction to perturb point clouds in the
region of interest (RoI) of 3D objects. By injecting illegitimate points behind
the target object, we effectively shift points away from the target object's
RoI. Our initial results using a simple random point selection strategy show
that the attack is effective in degrading the performance of commonly used 3D
object detection models.
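The mechanism described above can be sketched in a few lines of NumPy. This is a simplified illustration, not the authors' implementation: the function name, the axis-aligned RoI box, the point `budget`, and the `shift_dist` parameter are all assumptions made for the example. It models the single-return constraint by pushing a randomly selected budget of points inside the target RoI farther along their rays from the sensor (assumed at the origin), as a spoofed later return would.

```python
import numpy as np

def object_removal_attack(points, roi_min, roi_max, budget=200,
                          shift_dist=5.0, rng=None):
    """Sketch of an Object Removal Attack (ORA) on a LiDAR point cloud.

    A LiDAR in its default single-return mode records only one return per
    firing direction, so a spoofed return arriving from farther along the
    same ray displaces the legitimate point. We model this by translating
    a random subset of points inside the target object's RoI outward
    along their rays, so they land behind the object.

    points             : (N, 3) array of x, y, z coordinates
    roi_min, roi_max   : (3,) axis-aligned bounds of the target RoI
    budget             : maximum number of points the attacker perturbs
    shift_dist         : how far behind the object spoofed returns land (m)
    """
    rng = np.random.default_rng() if rng is None else rng
    attacked = points.copy()
    # Indices of points falling inside the target object's RoI.
    in_roi = np.flatnonzero(
        np.all((points >= roi_min) & (points <= roi_max), axis=1))
    # Random point selection strategy: perturb at most `budget` points.
    chosen = rng.choice(in_roi, size=min(budget, in_roi.size),
                        replace=False)
    rays = attacked[chosen]
    ranges = np.linalg.norm(rays, axis=1, keepdims=True)
    # Move each chosen point `shift_dist` metres farther along its ray.
    attacked[chosen] = rays * (ranges + shift_dist) / ranges
    return attacked, chosen
```

The untouched points are returned unchanged, so the output cloud differs from the input only at the (at most) `budget` perturbed locations, mimicking an attacker with a limited spoofing capability.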