Decentralized learning in the presence of low-rank noise
- URL: http://arxiv.org/abs/2203.09810v1
- Date: Fri, 18 Mar 2022 09:13:57 GMT
- Title: Decentralized learning in the presence of low-rank noise
- Authors: Roula Nassif, Virginia Bordignon, Stefan Vlaski, Ali H. Sayed
- Abstract summary: Observations collected by agents in a network may be unreliable due to observation noise or interference.
This paper proposes a distributed algorithm that allows each node to improve the reliability of its own observation.
- Score: 57.18977364494388
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Observations collected by agents in a network may be unreliable due to
observation noise or interference. This paper proposes a distributed algorithm
that allows each node to improve the reliability of its own observation by
relying solely on local computations and interactions with immediate neighbors,
assuming that the field (graph signal) monitored by the network lies in a
low-dimensional subspace and that a low-rank noise is present in addition to
the usual full-rank noise. While oblique projections can be used to project
measurements onto a low-rank subspace along a direction that is oblique to the
subspace, the resulting solution is not distributed. Starting from the
centralized solution, we propose an algorithm that performs the oblique
projection of the overall set of observations onto the signal subspace in an
iterative and distributed manner. We then show how the oblique projection
framework can be extended to handle distributed learning and adaptation
problems over networks.
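The abstract's starting point, the centralized oblique projection, can be sketched numerically. The following is a minimal illustration, not the paper's distributed algorithm: the bases `U` (signal subspace) and `V` (low-rank noise subspace) and all dimensions are made-up assumptions, and the Behrens-Scharf form of the oblique projector is used as one standard construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 20, 3, 2                 # ambient dim, signal-subspace dim, low-rank-noise dim

U = rng.standard_normal((n, p))    # illustrative basis of the signal subspace
V = rng.standard_normal((n, q))    # illustrative basis of the low-rank noise subspace

# Orthogonal projector onto the complement of span(V)
P_Vperp = np.eye(n) - V @ np.linalg.pinv(V)

# Oblique projector onto span(U) along span(V)
E = U @ np.linalg.inv(U.T @ P_Vperp @ U) @ U.T @ P_Vperp

x = U @ rng.standard_normal(p)         # true field lying in the signal subspace
noise_lr = V @ rng.standard_normal(q)  # low-rank noise component
y = x + noise_lr                       # observation (full-rank noise omitted here)

x_hat = E @ y                          # projection recovers x exactly: E U = U, E V = 0
```

With full-rank noise added to `y`, the projector still annihilates the low-rank component while leaving a residual from the full-rank part; the paper's contribution is computing this projection iteratively with only local neighbor interactions, which this centralized sketch does not reproduce.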
Related papers
- Subspace Defense: Discarding Adversarial Perturbations by Learning a Subspace for Clean Signals [52.123343364599094]
Adversarial attacks place carefully crafted perturbations on normal examples to fool deep neural networks (DNNs).
We first empirically show that the features of clean signals and of adversarial perturbations are redundant and span low-dimensional linear subspaces with minimal overlap.
This makes it possible for DNNs to learn a subspace where only features of clean signals exist while those of perturbations are discarded.
arXiv Detail & Related papers (2024-03-24T14:35:44Z)
- Distributed Bayesian Estimation in Sensor Networks: Consensus on Marginal Densities [15.038649101409804]
We derive a distributed provably-correct algorithm in the functional space of probability distributions over continuous variables.
We leverage these results to obtain new distributed estimators restricted to subsets of variables observed by individual agents.
This relates to applications such as cooperative localization and federated learning, where the data collected at any agent depends on a subset of all variables of interest.
arXiv Detail & Related papers (2023-12-02T21:10:06Z)
- SubspaceNet: Deep Learning-Aided Subspace Methods for DoA Estimation [36.647703652676626]
SubspaceNet is a data-driven DoA estimator which learns how to divide the observations into distinguishable subspaces.
SubspaceNet is shown to enable various DoA estimation algorithms to cope with coherent sources, wideband signals, low SNR, array mismatches, and limited snapshots.
arXiv Detail & Related papers (2023-06-04T06:30:13Z)
- Exploring Efficient Asymmetric Blind-Spots for Self-Supervised Denoising in Real-World Scenarios [44.31657750561106]
Noise in real-world scenarios is often spatially correlated, which causes many self-supervised algorithms to perform poorly.
We propose Asymmetric Tunable Blind-Spot Network (AT-BSN), where the blind-spot size can be freely adjusted.
We show that our method achieves state-of-the-art performance and is superior to other self-supervised algorithms in terms of computational overhead and visual quality.
arXiv Detail & Related papers (2023-03-29T15:19:01Z)
- Fast ABC with joint generative modelling and subset simulation [0.6445605125467573]
We propose a novel approach for solving inverse-problems with high-dimensional inputs and an expensive forward mapping.
It leverages joint deep generative modelling to map the original problem spaces to a lower-dimensional latent space.
arXiv Detail & Related papers (2021-04-16T15:03:23Z)
- Provable Generalization of SGD-trained Neural Networks of Any Width in the Presence of Adversarial Label Noise [85.59576523297568]
We consider a one-hidden-layer leaky ReLU network of arbitrary width trained by gradient descent.
We prove that SGD produces neural networks that have classification accuracy competitive with that of the best halfspace over the distribution.
arXiv Detail & Related papers (2021-01-04T18:32:49Z)
- Non-Local Spatial Propagation Network for Depth Completion [82.60915972250706]
We propose a robust and efficient end-to-end non-local spatial propagation network for depth completion.
The proposed network takes RGB and sparse depth images as inputs and estimates the non-local neighbors of each pixel together with their affinities.
We show that the proposed algorithm is superior to conventional algorithms in terms of depth completion accuracy and robustness to the mixed-depth problem.
arXiv Detail & Related papers (2020-07-20T12:26:51Z)
- Simultaneous Denoising and Dereverberation Using Deep Embedding Features [64.58693911070228]
We propose a joint training method for simultaneous speech denoising and dereverberation using deep embedding features.
At the denoising stage, the DC network is leveraged to extract noise-free deep embedding features.
At the dereverberation stage, instead of using the unsupervised K-means clustering algorithm, another neural network is utilized to estimate the anechoic speech.
arXiv Detail & Related papers (2020-04-06T06:34:01Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.