Diagonal Artifacts in Samsung Images: PRNU Challenges and Solutions
- URL: http://arxiv.org/abs/2510.09509v1
- Date: Fri, 10 Oct 2025 16:14:29 GMT
- Title: Diagonal Artifacts in Samsung Images: PRNU Challenges and Solutions
- Authors: David Vázquez-Padín, Fernando Pérez-González, Alejandro Martín-Del-Río
- Abstract summary: We investigate diagonal artifacts present in images captured by several Samsung smartphones. We first show that certain Galaxy S series models share a common pattern causing fingerprint collisions. We demonstrate that reliable PRNU verification remains feasible for devices supporting PRO mode with raw capture.
- Score: 81.64274598548052
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate diagonal artifacts present in images captured by several Samsung smartphones and their impact on PRNU-based camera source verification. We first show that certain Galaxy S series models share a common pattern causing fingerprint collisions, with a similar issue also found in some Galaxy A models. Next, we demonstrate that reliable PRNU verification remains feasible for devices supporting PRO mode with raw capture, since raw images bypass the processing pipeline that introduces artifacts. This option, however, is not available for the mid-range A series models or in forensic cases without access to raw images. Finally, we outline potential forensic applications of the diagonal artifacts, such as reducing misdetections in HDR images and localizing regions affected by synthetic bokeh in portrait-mode images.
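The PRNU verification workflow the abstract builds on can be illustrated with a minimal sketch. This is not the authors' implementation: the 3x3 mean denoiser, the NCC score, and the simulated sensor pattern are all simplifying assumptions (real pipelines typically use wavelet denoising and PCE statistics).

```python
import numpy as np

def denoise(img):
    """Crude 3x3 mean filter, a stand-in for the wavelet denoiser
    usually used in PRNU extraction."""
    p = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for dy in range(3):
        for dx in range(3):
            out += p[dy:dy + h, dx:dx + w]
    return out / 9.0

def noise_residual(img):
    """High-frequency residual that carries the PRNU trace."""
    img = img.astype(float)
    return img - denoise(img)

def ncc(a, b):
    """Normalized cross-correlation used as the similarity score."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# --- toy simulation: one camera with a fixed multiplicative PRNU pattern ---
rng = np.random.default_rng(0)
prnu = rng.normal(0.0, 0.02, (32, 32))   # hypothetical sensor pattern

def shoot(scene):
    """Simulated capture: scene modulated by PRNU plus shot noise."""
    return scene * (1.0 + prnu) + rng.normal(0.0, 1.0, scene.shape)

# Fingerprint: average residual over many flat-field shots
fingerprint = np.mean([noise_residual(shoot(np.full((32, 32), 128.0)))
                       for _ in range(50)], axis=0)

# Smooth test scene (heavy texture would swamp the PRNU in this toy setup)
scene = np.add.outer(np.linspace(60.0, 180.0, 32), np.linspace(0.0, 40.0, 32))
score_same = ncc(noise_residual(shoot(scene)), fingerprint)
score_other = ncc(noise_residual(scene + rng.normal(0.0, 1.0, scene.shape)),
                  fingerprint)
```

With this toy setup, `score_same` is high for an image from the fingerprinted camera and `score_other` stays near zero for one that is not; the diagonal artifacts studied in the paper break exactly this separation by making fingerprints collide across devices.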
Related papers
- Apple's Synthetic Defocus Noise Pattern: Characterization and Forensic Applications [46.700770585652634]
iPhone portrait-mode images contain a distinctive pattern in out-of-focus regions simulating the bokeh effect.
This pattern can interfere with blind forensic analyses, especially PRNU-based camera source verification.
We show that masking SDNP-affected regions in PRNU-based camera source verification significantly reduces false positives.
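The masking idea above can be sketched as a correlation restricted to unaffected pixels. The fingerprint, residual, and synthetic "artifact" region below are illustrative assumptions, not the paper's data.

```python
import numpy as np

def masked_ncc(residual, fingerprint, valid):
    """NCC computed only over pixels flagged as valid (True = keep)."""
    r = residual[valid].astype(float)
    k = fingerprint[valid].astype(float)
    r = r - r.mean()
    k = k - k.mean()
    return float((r * k).sum() / (np.linalg.norm(r) * np.linalg.norm(k) + 1e-12))

rng = np.random.default_rng(1)
fingerprint = rng.normal(size=(64, 64))

# Residual matches the fingerprint, except a region overwritten by a strong
# synthetic pattern (standing in for SDNP / bokeh artifacts).
residual = fingerprint + rng.normal(scale=1.0, size=(64, 64))
residual[:, 32:] = rng.normal(scale=5.0, size=(64, 32))

valid = np.ones((64, 64), dtype=bool)
valid[:, 32:] = False                      # mask out the artifact region

score_full = masked_ncc(residual, fingerprint, np.ones_like(valid))
score_masked = masked_ncc(residual, fingerprint, valid)
```

Excluding the corrupted region recovers a strong match score that the full-frame correlation would dilute, which is the mechanism behind the reduced false-positive rate reported above.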
arXiv Detail & Related papers (2025-05-12T09:27:20Z)
- Zero-Shot Detection of AI-Generated Images [54.01282123570917]
We propose a zero-shot entropy-based detector (ZED) to detect AI-generated images.
Inspired by recent works on machine-generated text detection, our idea is to measure how surprising the image under analysis is compared to a model of real images.
ZED achieves an average accuracy improvement of more than 3% over the state of the art.
arXiv Detail & Related papers (2024-09-24T08:46:13Z)
- BSRAW: Improving Blind RAW Image Super-Resolution [63.408484584265985]
We tackle blind image super-resolution in the RAW domain.
We design a realistic degradation pipeline tailored specifically for training models with raw sensor data.
Our BSRAW models trained with our pipeline can upscale real-scene RAW images and improve their quality.
arXiv Detail & Related papers (2023-12-24T14:17:28Z)
- Unsupervised Denoising for Signal-Dependent and Row-Correlated Imaging Noise [54.0185721303932]
We present the first fully unsupervised deep learning-based denoiser capable of handling imaging noise that is row-correlated as well as signal-dependent.
Our approach uses a Variational Autoencoder with a specially designed autoregressive decoder.
Our method does not require a pre-trained noise model and can be trained from scratch using unpaired noisy data.
arXiv Detail & Related papers (2023-10-11T20:48:20Z) - Perceptual Image Enhancement for Smartphone Real-Time Applications [60.45737626529091]
We propose LPIENet, a lightweight network for perceptual image enhancement.
Our model can deal with noise artifacts, diffraction artifacts, blur, and HDR overexposure.
Our model can process 2K-resolution images in under 1 second on mid-range commercial smartphones.
arXiv Detail & Related papers (2022-10-24T19:16:33Z)
- Beyond PRNU: Learning Robust Device-Specific Fingerprint for Source Camera Identification [14.404497406560104]
Source camera identification tools assist image forensic investigators to associate an image in question with a suspect camera.
The Photo Response Non-Uniformity (PRNU) noise pattern, caused by sensor imperfections, has been proven to be an effective way to identify the source camera.
However, PRNU is susceptible to camera settings, image content, image processing operations, and counter-forensic attacks.
A new device fingerprint is instead extracted from the low- and mid-frequency bands, which resolves the fragility issues that PRNU cannot contend with.
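The band-selection idea can be sketched with a simple FFT band-pass filter. The cutoff frequencies below are illustrative assumptions, and a hand-crafted filter is only a stand-in for the learned fingerprint extractor this paper proposes.

```python
import numpy as np

def band_limited(img, lo=0.02, hi=0.35):
    """Keep only spatial frequencies with normalized radius in [lo, hi],
    discarding DC/very-low content and the high band where PRNU lives.
    The lo/hi cutoffs are illustrative, not values from the paper."""
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    r = np.hypot(fy, fx)                  # normalized radial frequency
    return np.real(np.fft.ifft2(F * ((r >= lo) & (r <= hi))))

# Sanity check: a mid-band cosine passes, a high-band cosine is removed.
n = np.arange(64)
mid = np.cos(2 * np.pi * 6 * n / 64)[:, None] * np.ones((1, 64))    # f ~ 0.094
high = np.cos(2 * np.pi * 28 * n / 64)[:, None] * np.ones((1, 64))  # f ~ 0.438
out = band_limited(mid + high)
```

Fingerprints built from the surviving low/mid band are less sensitive to the processing operations that perturb the high-frequency PRNU signal.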
arXiv Detail & Related papers (2021-11-03T11:25:19Z)
- MToFNet: Object Anti-Spoofing with Mobile Time-of-Flight Data [9.632104433799256]
In online markets, sellers can maliciously recapture others' images on display screens to utilize as spoof images.
We propose an anti-spoofing method using the paired images and depth maps provided by a mobile camera with a Time-of-Flight sensor.
We build a novel representation model composed of two embedding models, which can be trained without considering the recaptured images.
arXiv Detail & Related papers (2021-10-06T05:24:33Z)
- A leak in PRNU based source identification. Questioning fingerprint uniqueness [75.33542585238497]
Photo Response Non-Uniformity (PRNU) is considered the most effective trace for the image source attribution task.
Recent devices may introduce non-unique artifacts that reduce the distinctiveness of the PRNU noise.
We show that the primary cause of high false alarm rates cannot be directly related to specific camera models, firmware, or image contents.
arXiv Detail & Related papers (2020-09-10T14:18:38Z)
- On the Reliability of the PNU for Source Camera Identification Tasks [2.885175627590247]
The PNU (pixel non-uniformity) is an essential tool for source camera identification (SCI) and, over the years, has become a de facto standard for this task in the forensic field.
We show that, although strategies exist that aim to cancel, modify, or replace the PNU traces in a digital camera image, our experimental method can still find residual traces of the noise produced by the sensor used to shoot the photo.
arXiv Detail & Related papers (2020-08-28T15:15:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.