UltraGS: Gaussian Splatting for Ultrasound Novel View Synthesis
- URL: http://arxiv.org/abs/2511.07743v1
- Date: Wed, 12 Nov 2025 01:14:28 GMT
- Title: UltraGS: Gaussian Splatting for Ultrasound Novel View Synthesis
- Authors: Yuezhe Yang, Wenjie Cai, Dexin Yang, Yufang Dong, Xingbo Dong, Zhe Jin,
- Abstract summary: We propose \textbf{UltraGS}, a Gaussian Splatting framework optimized for ultrasound imaging. First, we introduce a depth-aware Gaussian splatting strategy, where each Gaussian is assigned a learnable field of view. Second, we design SH-DARS, a lightweight rendering function combining low-order spherical harmonics with ultrasound-specific wave physics.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Ultrasound imaging is a cornerstone of non-invasive clinical diagnostics, yet its limited field of view complicates novel view synthesis. We propose \textbf{UltraGS}, a Gaussian Splatting framework optimized for ultrasound imaging. First, we introduce a depth-aware Gaussian splatting strategy, where each Gaussian is assigned a learnable field of view, enabling accurate depth prediction and precise structural representation. Second, we design SH-DARS, a lightweight rendering function combining low-order spherical harmonics with ultrasound-specific wave physics, including depth attenuation, reflection, and scattering, to model tissue intensity accurately. Third, we contribute the Clinical Ultrasound Examination Dataset, a benchmark capturing diverse anatomical scans under real-world clinical protocols. Extensive experiments on three datasets demonstrate UltraGS's superiority, achieving state-of-the-art results in PSNR (up to 29.55), SSIM (up to 0.89), and MSE (as low as 0.002) while enabling real-time synthesis at 64.69 fps. The code and dataset are open-sourced at: https://github.com/Bean-Young/UltraGS.
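The abstract only names the wave-physics terms SH-DARS combines (depth attenuation, reflection, scattering) without giving the formulation. As a rough, hypothetical illustration of those terms, the toy functions below use textbook soft-tissue constants (~0.5 dB/cm/MHz attenuation, impedance-based reflection); they are not the paper's actual rendering function.

```python
# Hypothetical sketch of the physics terms SH-DARS is described as combining.
# Coefficients and the composition are illustrative, not from the paper.

def depth_attenuation(depth_cm, freq_mhz, alpha_db=0.5):
    """Soft-tissue attenuation (~0.5 dB/cm/MHz) as a linear amplitude gain."""
    return 10.0 ** (-alpha_db * freq_mhz * depth_cm / 20.0)

def reflection_coefficient(z1, z2):
    """Amplitude reflection at an interface between acoustic impedances z1, z2."""
    return (z2 - z1) / (z2 + z1)

def rendered_intensity(base, depth_cm, freq_mhz, z1, z2, scatter=0.1):
    """Combine a base (SH-derived) intensity with attenuation, reflection,
    and a constant diffuse-scattering floor."""
    att = depth_attenuation(depth_cm, freq_mhz)
    refl = abs(reflection_coefficient(z1, z2))
    return base * att * (refl + scatter)
```

In this toy composition, intensity decays exponentially with depth and frequency, which matches why deeper structures appear darker at higher probe frequencies.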
Related papers
- DiffUS: Differentiable Ultrasound Rendering from Volumetric Imaging [0.0]
Intraoperative ultrasound imaging provides real-time guidance during numerous surgical procedures, but its interpretation is complicated by noise, volumetric artifacts, and poor alignment with high-resolution preoperative MRI/CT scans. We present DiffUS, a physics-based rendering approach that synthesizes realistic B-mode images from volumetric imaging.
arXiv Detail & Related papers (2025-08-09T01:04:11Z) - OpenPros: A Large-Scale Dataset for Limited View Prostate Ultrasound Computed Tomography [25.844490531325537]
Prostate cancer is one of the most common and lethal cancers among men. Traditional transrectal ultrasound methods suffer from low sensitivity, especially in detecting anteriorly located tumors. OpenPros is the first large-scale benchmark dataset explicitly developed for limited-view prostate USCT.
arXiv Detail & Related papers (2025-05-18T06:56:49Z) - UltraGauss: Ultrafast Gaussian Reconstruction of 3D Ultrasound Volumes [15.02330703285484]
2D-to-3D reconstructions are often computationally expensive, memory-intensive, or incompatible with ultrasound physics. We introduce UltraGauss: the first ultrasound-specific Gaussian Splatting framework, extending view synthesis techniques to ultrasound wave propagation. On real clinical ultrasound data, UltraGauss achieves state-of-the-art reconstructions in 5 minutes, reaching 0.99 SSIM within 20 minutes on a single image.
arXiv Detail & Related papers (2025-05-08T20:53:47Z) - UltraRay: Introducing Full-Path Ray Tracing in Physics-Based Ultrasound Simulation [40.609593860836554]
We propose a novel ultrasound simulation pipeline that utilizes a ray tracing algorithm to generate echo data. To replicate advanced ultrasound imaging, we introduce a ray emission scheme optimized for plane wave imaging, incorporating delay and steering capabilities. In doing so, our proposed approach, UltraRay, not only enhances the overall visual quality but also improves the realism of the simulated images.
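The delay-and-steering scheme for plane-wave transmit that UltraRay's ray emission is described as optimized for follows standard array geometry: tilting the wavefront by an angle theta requires per-element firing delays proportional to the element's lateral position. A minimal sketch, using the textbook delay formula rather than anything taken from the paper:

```python
import numpy as np

# Generic plane-wave transmit delays with steering; constants are textbook
# values (c ~ 1540 m/s in soft tissue), not parameters from UltraRay.

def plane_wave_delays(element_x, theta_rad, c=1540.0):
    """Per-element firing delays (seconds) that tilt the transmitted
    plane wavefront by theta_rad.

    element_x : array of transducer element lateral positions (m)
    """
    delays = element_x * np.sin(theta_rad) / c
    return delays - delays.min()  # shift so the earliest element fires at t=0
```

With theta = 0 all elements fire simultaneously (an unsteered plane wave); a positive angle produces linearly increasing delays across the aperture.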
arXiv Detail & Related papers (2025-01-10T10:07:41Z) - UlRe-NeRF: 3D Ultrasound Imaging through Neural Rendering with Ultrasound Reflection Direction Parameterization [0.5837446811360741]
Traditional 3D ultrasound imaging methods have limitations such as fixed resolution, low storage efficiency, and insufficient contextual connectivity.
We propose a new model, UlRe-NeRF, which combines implicit neural networks and explicit ultrasound rendering architecture.
Experimental results demonstrate that the UlRe-NeRF model significantly enhances the realism and accuracy of high-fidelity ultrasound image reconstruction.
arXiv Detail & Related papers (2024-08-01T18:22:29Z) - R$^2$-Gaussian: Rectifying Radiative Gaussian Splatting for Tomographic Reconstruction [53.19869886963333]
3D Gaussian splatting (3DGS) has shown promising results in rendering image and surface reconstruction.
This paper introduces R$^2$-Gaussian, the first 3DGS-based framework for sparse-view tomographic reconstruction.
arXiv Detail & Related papers (2024-05-31T08:39:02Z) - CathFlow: Self-Supervised Segmentation of Catheters in Interventional Ultrasound Using Optical Flow and Transformers [66.15847237150909]
We introduce a self-supervised deep learning architecture to segment catheters in longitudinal ultrasound images.
The network architecture builds upon AiAReSeg, a segmentation transformer built with the Attention in Attention mechanism.
We validated our model on a test dataset, consisting of unseen synthetic data and images collected from silicon aorta phantoms.
arXiv Detail & Related papers (2024-03-21T15:13:36Z) - UNICORN: Ultrasound Nakagami Imaging via Score Matching and Adaptation [59.91293113930909]
Nakagami imaging holds promise for visualizing and quantifying tissue scattering in ultrasound waves.
Existing methods struggle with optimal window size selection and suffer from estimator instability.
We propose a novel method called UNICORN that offers an accurate, closed-form estimator for Nakagami parameter estimation.
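For context on the estimator instability UNICORN addresses, the classical window-based approach fits the Nakagami shape and scale by the method of moments (the inverse normalized variance estimator). The sketch below shows that baseline only; UNICORN's own score-matching estimator is not reproduced here.

```python
import numpy as np

# Classical moment-based Nakagami parameter estimation over one window of
# envelope samples -- the conventional baseline, not UNICORN's estimator.

def nakagami_moments(envelope):
    """Estimate (m, omega) from ultrasound envelope samples.

    omega (scale) is E[R^2]; m (shape) is the inverse normalized
    variance of R^2, i.e. E[R^2]^2 / Var(R^2).
    """
    r2 = np.asarray(envelope, dtype=float) ** 2
    omega = r2.mean()
    m = omega**2 / r2.var()
    return m, omega
```

Because the shape estimate divides by a sample variance, it becomes noisy in small windows, which is the instability the abstract refers to.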
arXiv Detail & Related papers (2024-03-10T18:05:41Z) - BrainVoxGen: Deep learning framework for synthesis of Ultrasound to MRI [2.982610402087728]
The work proposes a novel deep-learning framework for the synthesis of three-dimensional MRI volumes from corresponding 3D ultrasound images of the brain.
This research holds promise for transformative applications in medical diagnostics and treatment planning within the neuroimaging domain.
arXiv Detail & Related papers (2023-10-11T20:37:59Z) - AiAReSeg: Catheter Detection and Segmentation in Interventional Ultrasound using Transformers [75.20925220246689]
Endovascular surgeries are performed under the gold standard of fluoroscopy, which uses ionising radiation to visualise catheters and vasculature.
This work proposes a solution using an adaptation of a state-of-the-art machine learning transformer architecture to detect and segment catheters in axial interventional Ultrasound image sequences.
arXiv Detail & Related papers (2023-09-25T19:34:12Z) - OADAT: Experimental and Synthetic Clinical Optoacoustic Data for Standardized Image Processing [62.993663757843464]
Optoacoustic (OA) imaging is based on excitation of biological tissues with nanosecond-duration laser pulses followed by detection of ultrasound waves generated via light-absorption-mediated thermoelastic expansion.
OA imaging features a powerful combination between rich optical contrast and high resolution in deep tissues.
No standardized datasets spanning different experimental set-ups and associated processing methods are yet available to facilitate advances in broader clinical applications of OA.
arXiv Detail & Related papers (2022-06-17T08:11:26Z) - Deep Learning for Ultrasound Beamforming [120.12255978513912]
Beamforming, the process of mapping received ultrasound echoes to the spatial image domain, lies at the heart of the ultrasound image formation chain.
Modern ultrasound imaging leans heavily on innovations in powerful digital receive channel processing.
Deep learning methods can play a compelling role in the digital beamforming pipeline.
arXiv Detail & Related papers (2021-09-23T15:15:21Z)
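The beamforming step described above — mapping received echoes to the spatial image domain — is conventionally done with delay-and-sum (DAS), the baseline that deep-learning beamformers replace or augment. A minimal single-pixel sketch with simplified geometry (linear array, plane-wave transmit, known speed of sound); this is the generic textbook algorithm, not any specific paper's pipeline:

```python
import numpy as np

# Minimal delay-and-sum beamforming of one pixel from per-channel RF data.
# Geometry is deliberately simplified for illustration.

def das_pixel(rf, element_x, fs, px, pz, c=1540.0):
    """Beamform pixel (px, pz) in metres.

    rf        : (n_elements, n_samples) received echo traces
    element_x : element lateral positions (m)
    fs        : sampling rate (Hz)
    Assumes an unsteered plane-wave transmit crossing depth pz at t = pz / c.
    """
    # two-way time of flight: transmit to depth pz, then echo back to each element
    t_rx = np.sqrt((element_x - px) ** 2 + pz**2) / c
    t = pz / c + t_rx
    idx = np.clip(np.round(t * fs).astype(int), 0, rf.shape[1] - 1)
    # pick each channel's sample at its delay and sum coherently
    return rf[np.arange(rf.shape[0]), idx].sum()
```

Looping this over a pixel grid yields the B-mode image; deep beamformers typically learn to replace the fixed summation (e.g. with adaptive apodization weights) while keeping the same delay geometry.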
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.