QuietPrint: Protecting 3D Printers Against Acoustic Side-Channel Attacks
- URL: http://arxiv.org/abs/2602.02198v1
- Date: Mon, 02 Feb 2026 15:04:00 GMT
- Title: QuietPrint: Protecting 3D Printers Against Acoustic Side-Channel Attacks
- Authors: Seyed Ali Ghazi Asgar, Narasimha Reddy
- Abstract summary: Cyber-attacks targeting the 3D printing process are becoming increasingly common. One major concern is intellectual property (IP) theft, where a malicious attacker gains access to the design file. In this work, we investigate the possibility of IP theft via acoustic side channels and propose a novel method to protect 3D printers.
- Score: 0.5729426778193398
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The 3D printing market has experienced significant growth in recent years, with an estimated revenue of 15 billion USD for 2025. Cyber-attacks targeting the 3D printing process, whether through the machine itself, the supply chain, or the fabricated components, are becoming increasingly common. One major concern is intellectual property (IP) theft, where a malicious attacker gains access to the design file. One method for carrying out such theft is through side-channel attacks. In this work, we investigate the possibility of IP theft via acoustic side channels and propose a novel method to protect 3D printers against such attacks. The primary advantage of our approach is that it requires no additional hardware, such as large speakers or noise-canceling devices. Instead, it secures printed parts through minimal modifications to the G-code.
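The abstract does not specify which G-code modifications the method applies. As a purely hypothetical illustration of the general idea, the sketch below jitters feed rates (`F` values) on print moves while leaving the toolpath geometry (`X`/`Y`/`Z` targets) untouched, which would alter the motors' acoustic signature without changing the printed part. The function name, parameters, and strategy are assumptions, not taken from the paper.

```python
import random
import re

def obfuscate_gcode(lines, jitter=0.05, seed=0):
    """Hypothetical sketch: randomly perturb feed rates (F values) in G1 moves
    by up to +/- jitter, leaving X/Y/Z targets unchanged so the part geometry
    is preserved while the motion (and thus acoustic) profile varies."""
    rng = random.Random(seed)
    out = []
    for line in lines:
        if line.startswith("G1") and "F" in line:
            def repl(m):
                f = float(m.group(1))
                return "F%.1f" % (f * (1 + rng.uniform(-jitter, jitter)))
            line = re.sub(r"F([0-9.]+)", repl, line)
        out.append(line)
    return out
```

Because only feed rates change, slicer-level geometry checks would still pass; in practice one would also need to keep the perturbed feed rates within the printer's safe limits.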
Related papers
- Intellectual Property Protection for 3D Gaussian Splatting Assets: A Survey [89.1493370852336]
3D Gaussian Splatting (3DGS) has become a mainstream representation for real-time 3D scene synthesis, enabling applications in virtual and augmented reality, robotics, and 3D content creation. Its rising commercial value and explicit parametric structure raise emerging intellectual property (IP) protection concerns. Current progress remains fragmented, lacking a unified view of the underlying mechanisms, protection paradigms, and robustness challenges.
arXiv Detail & Related papers (2026-02-02T16:27:51Z)
- Turning Hearsay into Discovery: Industrial 3D Printer Side Channel Information Translated to Stealing the Object Design [46.740145853674875]
We show for the first time that side-channel attacks are a serious threat to industrial grade 3D printers. We reconstruct the 3D printed model solely from the collected power side-channel data.
arXiv Detail & Related papers (2025-09-22T19:46:21Z)
- Evaluating the printability of STL files with ML [0.0]
Our approach introduces a novel layer of support by training an AI model to detect common issues in 3D models. The goal is to assist less experienced users by identifying features that are likely to cause print failures.
arXiv Detail & Related papers (2025-09-15T19:37:00Z)
- One Video to Steal Them All: 3D-Printing IP Theft through Optical Side-Channels [6.082508741253127]
We show that an adversary with access to video recordings of the 3D printing process can reverse engineer the underlying 3D print instructions. Our model tracks the printer nozzle movements during the printing process and maps the corresponding trajectory into G-code instructions. It identifies the correct parameters such as feed rate and extrusion rate, enabling successful intellectual property theft.
arXiv Detail & Related papers (2025-06-27T04:34:07Z)
- PersGuard: Preventing Malicious Personalization via Backdoor Attacks on Pre-trained Text-to-Image Diffusion Models [51.458089902581456]
We introduce PersGuard, a novel backdoor-based approach that prevents malicious personalization of specific images. Our method significantly outperforms existing techniques, offering a more robust solution for privacy and copyright protection.
arXiv Detail & Related papers (2025-02-22T09:47:55Z)
- Practitioner Paper: Decoding Intellectual Property: Acoustic and Magnetic Side-channel Attack on a 3D Printer [3.0832643041058607]
This work demonstrates the feasibility of reconstructing G-codes by performing side-channel attacks on a 3D printer.
By training models using Gradient Boosted Decision Trees, our prediction results for each axial movement, stepper, nozzle, and rotor speed achieve high accuracy.
We effectively deploy the model in a real-world examination, achieving a Mean Tendency Error (MTE) of 4.47% on a plain G-code design.
arXiv Detail & Related papers (2024-11-16T21:05:25Z)
- Poison-splat: Computation Cost Attack on 3D Gaussian Splatting [90.88713193520917]
We reveal a significant security vulnerability that has been largely overlooked in 3DGS. The adversary can poison the input images to drastically increase the computation memory and time needed for 3DGS training. Such a computation cost attack is achieved by addressing a bi-level optimization problem through three tailored strategies.
arXiv Detail & Related papers (2024-10-10T17:57:29Z)
- Evaluating and Mitigating IP Infringement in Visual Generative AI [54.24196167576133]
State-of-the-art visual generative models can generate content that bears a striking resemblance to characters protected by intellectual property rights.
This happens when the input prompt contains the character's name or even just descriptive details about their characteristics.
We develop a revised generation paradigm that can identify potentially infringing generated content and prevent IP infringement.
arXiv Detail & Related papers (2024-06-07T06:14:18Z)
- Stop Stealing My Data: Sanitizing Stego Channels in 3D Printing Design Files [56.96539046813698]
Steganographic channels can allow additional data to be embedded within the STL files without changing the printed model. This paper addresses this security threat by designing and evaluating a sanitizer that erases hidden content where steganographic channels might exist.
arXiv Detail & Related papers (2024-04-07T23:28:35Z)
- Secure Information Embedding in Forensic 3D Fingerprinting [15.196378932114518]
We introduce SIDE, a novel fingerprinting framework tailored for 3D printing. SIDE addresses the adversarial challenges of 3D printing by offering both secure information embedding and extraction.
arXiv Detail & Related papers (2024-03-07T22:03:46Z)
- Poisoned Forgery Face: Towards Backdoor Attacks on Face Forgery Detection [62.595450266262645]
This paper introduces a novel and previously unrecognized threat in face forgery detection scenarios caused by backdoor attack.
By embedding backdoors into models, attackers can deceive detectors into producing erroneous predictions for forged faces.
We propose the Poisoned Forgery Face framework, which enables clean-label backdoor attacks on face forgery detectors.
arXiv Detail & Related papers (2024-02-18T06:31:05Z)
This list is automatically generated from the titles and abstracts of the papers indexed on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.