Federated Neural Radiance Fields
- URL: http://arxiv.org/abs/2305.01163v1
- Date: Tue, 2 May 2023 02:33:22 GMT
- Title: Federated Neural Radiance Fields
- Authors: Lachlan Holden, Feras Dayoub, David Harvey, Tat-Jun Chin
- Abstract summary: We consider training NeRFs in a federated manner, whereby multiple compute nodes, each having acquired a distinct set of observations of the overall scene, learn a common NeRF in parallel.
Our contribution is the first federated learning algorithm for NeRF, which splits the training effort across multiple compute nodes and obviates the need to pool the images at a central node.
A technique based on low-rank decomposition of NeRF layers is introduced to reduce bandwidth consumption to transmit the model parameters for aggregation.
- Score: 36.42289161746808
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The ability of neural radiance fields or NeRFs to conduct accurate 3D
modelling has motivated the application of the technique to scene representation.
Previous approaches have mainly followed a centralised learning paradigm, which
assumes that all training images are available on one compute node for
training. In this paper, we consider training NeRFs in a federated manner,
whereby multiple compute nodes, each having acquired a distinct set of
observations of the overall scene, learn a common NeRF in parallel. This
supports the scenario of cooperatively modelling a scene using multiple agents.
Our contribution is the first federated learning algorithm for NeRF, which
splits the training effort across multiple compute nodes and obviates the need
to pool the images at a central node. A technique based on low-rank
decomposition of NeRF layers is introduced to reduce bandwidth consumption to
transmit the model parameters for aggregation. Transferring compressed models
instead of the raw data also contributes to the privacy of the data collecting
agents.
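The low-rank communication scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names, the use of truncated SVD as the decomposition, and plain FedAvg-style averaging as the aggregation rule are all assumptions, and random matrices stand in for trained NeRF layer weights.

```python
import numpy as np

def low_rank_compress(W, rank):
    """Factor a layer's weight matrix W (m x n) into A (m x r) and B (r x n)
    via truncated SVD; sending A and B costs r*(m+n) values instead of m*n."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank] * s[:rank], Vt[:rank, :]

def low_rank_decompress(A, B):
    """Reconstruct the (approximate) weight matrix from its factors."""
    return A @ B

def federated_average(weights):
    """FedAvg-style aggregation: average the reconstructed client weights."""
    return sum(weights) / len(weights)

# Each compute node would train its NeRF locally on its own images, then
# transmit only the low-rank factors of each layer for aggregation.
rng = np.random.default_rng(0)
client_layers = [rng.standard_normal((256, 256)) for _ in range(3)]  # stand-ins
rank = 16  # bandwidth: 2*256*16 = 8192 values vs 256*256 = 65536, ~8x less

received = [low_rank_decompress(*low_rank_compress(W, rank)) for W in client_layers]
global_layer = federated_average(received)
print(global_layer.shape)  # (256, 256)
```

Note that transmitting factors rather than images also keeps each agent's raw observations local, which is the privacy benefit the abstract mentions.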
Related papers
- HyperPlanes: Hypernetwork Approach to Rapid NeRF Adaptation [4.53411151619456]
We propose a few-shot learning approach based on the hypernetwork paradigm that does not require gradient optimization during inference.
We have developed an efficient method for generating a high-quality 3D object representation from a small number of images in a single step.
arXiv Detail & Related papers (2024-02-02T16:10:29Z)
- CorresNeRF: Image Correspondence Priors for Neural Radiance Fields [45.40164120559542]
CorresNeRF is a novel method that leverages image correspondence priors computed by off-the-shelf methods to supervise NeRF training.
We show that this simple yet effective technique of using correspondence priors can be applied as a plug-and-play module across different NeRF variants.
arXiv Detail & Related papers (2023-12-11T18:55:29Z)
- SimpleNeRF: Regularizing Sparse Input Neural Radiance Fields with Simpler Solutions [6.9980855647933655]
Supervising the depth estimated by the NeRF helps train it effectively with fewer views.
We design augmented models that encourage simpler solutions by exploring the role of positional encoding and view-dependent radiance.
We achieve state-of-the-art view-synthesis performance on two popular datasets by employing the above regularizations.
arXiv Detail & Related papers (2023-09-07T18:02:57Z)
- NeRFuser: Large-Scale Scene Representation by NeRF Fusion [35.749208740102546]
A practical benefit of implicit visual representations like Neural Radiance Fields (NeRFs) is their memory efficiency.
We propose NeRFuser, a novel architecture for NeRF registration and blending that assumes only access to pre-generated NeRFs.
arXiv Detail & Related papers (2023-05-22T17:59:05Z)
- Registering Neural Radiance Fields as 3D Density Images [55.64859832225061]
We propose to use universal pre-trained neural networks that can be trained and tested on different scenes.
We demonstrate that our method, as a global approach, can effectively register NeRF models.
arXiv Detail & Related papers (2023-05-22T09:08:46Z)
- Single-Stage Diffusion NeRF: A Unified Approach to 3D Generation and Reconstruction [77.69363640021503]
3D-aware image synthesis encompasses a variety of tasks, such as scene generation and novel view synthesis from images.
We present SSDNeRF, a unified approach that employs an expressive diffusion model to learn a generalizable prior of neural radiance fields (NeRF) from multi-view images of diverse objects.
arXiv Detail & Related papers (2023-04-13T17:59:01Z)
- Sparse Signal Models for Data Augmentation in Deep Learning ATR [0.8999056386710496]
We propose a data augmentation approach to incorporate domain knowledge and improve the generalization power of a data-intensive learning algorithm.
We exploit the sparsity of the scattering centers in the spatial domain and the smoothly-varying structure of the scattering coefficients in the azimuthal domain to solve the ill-posed problem of over-parametrized model fitting.
arXiv Detail & Related papers (2020-12-16T21:46:33Z)
- iNeRF: Inverting Neural Radiance Fields for Pose Estimation [68.91325516370013]
We present iNeRF, a framework that performs mesh-free pose estimation by "inverting" a Neural Radiance Field (NeRF).
NeRFs have been shown to be remarkably effective for the task of view synthesis.
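The "inversion" idea behind iNeRF, optimising camera pose by gradient descent on a photometric loss between rendered and observed images, can be illustrated with a toy example. Everything here is a hypothetical stand-in: a 2-D translation in place of a full 6-DoF pose, an analytic scene function in place of a trained NeRF, and numerical gradients in place of autodiff.

```python
import numpy as np

def render(pose, scene_fn, pixels):
    """Toy stand-in for NeRF rendering: evaluate a fixed scene function at
    pixel locations shifted by a 2-D pose (tx, ty)."""
    return scene_fn(pixels + pose)

def invert_pose(observed, scene_fn, pixels, init_pose, lr=0.5, steps=300, eps=1e-4):
    """iNeRF-style pose estimation: gradient descent on the photometric loss
    between rendered and observed images (forward-difference gradients)."""
    pose = np.array(init_pose, dtype=float)
    for _ in range(steps):
        base = np.mean((render(pose, scene_fn, pixels) - observed) ** 2)
        grad = np.zeros_like(pose)
        for i in range(pose.size):
            p = pose.copy()
            p[i] += eps
            grad[i] = (np.mean((render(p, scene_fn, pixels) - observed) ** 2) - base) / eps
        pose -= lr * grad
    return pose

# Toy scene: brightness varies smoothly with position.
scene = lambda x: np.sin(x[:, 0]) * np.cos(x[:, 1])
pix = np.stack(np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8)), -1).reshape(-1, 2)
true_pose = np.array([0.3, -0.2])
obs = render(true_pose, scene, pix)
est = invert_pose(obs, scene, pix, init_pose=[0.0, 0.0])
print(est)  # should approach true_pose
```

In the actual method, the renderer is a trained NeRF, the pose lives in SE(3), and gradients flow through the differentiable volume renderer rather than finite differences.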
arXiv Detail & Related papers (2020-12-10T18:36:40Z)
- Explanation-Guided Training for Cross-Domain Few-Shot Classification [96.12873073444091]
Cross-domain few-shot classification task (CD-FSC) combines few-shot classification with the requirement to generalize across domains represented by datasets.
We introduce a novel training approach for existing FSC models.
We show that explanation-guided training effectively improves the model generalization.
arXiv Detail & Related papers (2020-07-17T07:28:08Z)
- Pre-Trained Models for Heterogeneous Information Networks [57.78194356302626]
We propose a self-supervised pre-training and fine-tuning framework, PF-HIN, to capture the features of a heterogeneous information network.
PF-HIN consistently and significantly outperforms state-of-the-art alternatives on each of these tasks, on four datasets.
arXiv Detail & Related papers (2020-07-07T03:36:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.