Deep Learning-Aided Subspace-Based DOA Recovery for Sparse Arrays
- URL: http://arxiv.org/abs/2309.05109v2
- Date: Sun, 17 Dec 2023 17:45:19 GMT
- Title: Deep Learning-Aided Subspace-Based DOA Recovery for Sparse Arrays
- Authors: Yoav Amiel, Dor H. Shmuel, Nir Shlezinger, and Wasim Huleihel
- Abstract summary: We propose Sparse-SubspaceNet, which leverages deep learning to enable subspace-based DoA estimation in sparse arrays.
By doing so, we learn to cope with coherent sources and miscalibrated sparse arrays, while preserving the interpretability and the suitability of model-based subspace DoA estimators.
- Score: 25.776724012525662
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sparse arrays enable resolving more directions of arrival (DoAs) than antenna
elements by using non-uniform geometries. This is typically achieved by reconstructing
the covariance of a virtual large uniform linear array (ULA), which is then
processed by subspace DoA estimators. However, these methods assume that the
signals are non-coherent and that the array is calibrated; the latter is often
challenging to achieve in sparse arrays, where one cannot access the virtual
array elements. In this work, we propose Sparse-SubspaceNet, which leverages
deep learning to enable subspace-based DoA recovery from sparse miscalibrated
arrays with coherent sources. Sparse-SubspaceNet utilizes a dedicated deep
network to learn from data how to compute a surrogate virtual array covariance
that is divisible into distinguishable subspaces. By doing so, we learn to cope
with coherent sources and miscalibrated sparse arrays, while preserving the
interpretability and the suitability of model-based subspace DoA estimators.
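The model-based subspace estimators the abstract refers to (e.g. MUSIC) operate on an array covariance, whether estimated from data or produced as a learned surrogate. As a point of reference, below is a minimal NumPy sketch of classical MUSIC on a simulated uniform linear array; the array geometry, source angles, snapshot count, and noise level are illustrative assumptions, not values from the paper.

```python
import numpy as np

def music_spectrum(R, n_sources, positions, grid):
    """Classical MUSIC pseudo-spectrum from a covariance matrix R.

    positions: sensor locations in half-wavelength units.
    grid: candidate DoAs in radians.
    """
    # eigh returns eigenvalues in ascending order for Hermitian input.
    _, eigvecs = np.linalg.eigh(R)
    # Noise subspace: eigenvectors of the smallest N - n_sources eigenvalues.
    En = eigvecs[:, : R.shape[0] - n_sources]
    spec = np.empty(len(grid))
    for i, theta in enumerate(grid):
        a = np.exp(1j * np.pi * positions * np.sin(theta))  # steering vector
        # Peaks occur where a(theta) is nearly orthogonal to the noise subspace.
        spec[i] = 1.0 / np.real(a.conj() @ (En @ (En.conj().T @ a)))
    return spec

# Toy scenario (illustrative values): 6-element ULA, two non-coherent
# sources at -20 and +30 degrees, 200 snapshots, high SNR.
rng = np.random.default_rng(0)
pos = np.arange(6)
doas = np.deg2rad([-20.0, 30.0])
A = np.exp(1j * np.pi * np.outer(pos, np.sin(doas)))        # steering matrix
S = rng.standard_normal((2, 200)) + 1j * rng.standard_normal((2, 200))
noise = 0.1 * (rng.standard_normal((6, 200)) + 1j * rng.standard_normal((6, 200)))
X = A @ S + noise
R = X @ X.conj().T / 200                                    # sample covariance

grid = np.deg2rad(np.linspace(-90.0, 90.0, 361))
spec = music_spectrum(R, 2, pos, grid)
# Pick the two largest local maxima of the pseudo-spectrum.
peaks = [i for i in range(1, len(grid) - 1) if spec[i - 1] < spec[i] > spec[i + 1]]
top = sorted(sorted(peaks, key=lambda i: spec[i])[-2:])
est = np.rad2deg(grid[top])
```

The point of Sparse-SubspaceNet is that this same estimator breaks down when R comes from a miscalibrated sparse array or coherent sources; the learned surrogate covariance is what restores the subspace structure MUSIC needs.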
Related papers
- Subspace Representation Learning for Sparse Linear Arrays to Localize More Sources than Sensors: A Deep Learning Methodology [19.100476521802243]
We develop a novel methodology that estimates the co-array subspaces from a sample covariance for sparse linear arrays (SLAs).
To learn such representations, we propose loss functions that gauge the separation between the desired and the estimated subspace.
The computation of learning subspaces of different dimensions is accelerated by a new batch sampling strategy.
arXiv Detail & Related papers (2024-08-29T15:14:52Z) - Sparse Array Design for Direction Finding using Deep Learning [19.061021605579683]
Deep learning (DL) techniques have been introduced for designing sparse arrays.
This chapter provides a synopsis of several direction finding applications of DL-based sparse arrays.
arXiv Detail & Related papers (2023-08-08T22:45:48Z) - SubspaceNet: Deep Learning-Aided Subspace Methods for DoA Estimation [36.647703652676626]
SubspaceNet is a data-driven DoA estimator which learns how to divide the observations into distinguishable subspaces.
SubspaceNet is shown to enable various DoA estimation algorithms to cope with coherent sources, wideband signals, low SNR, array mismatches, and limited snapshots.
arXiv Detail & Related papers (2023-06-04T06:30:13Z) - Learning Structure Aware Deep Spectral Embedding [11.509692423756448]
We propose a novel structure-aware deep spectral embedding by combining a spectral embedding loss and a structure preservation loss.
A deep neural network architecture is proposed that simultaneously encodes both types of information and aims to generate structure-aware spectral embedding.
The proposed algorithm is evaluated on six publicly available real-world datasets.
arXiv Detail & Related papers (2023-05-14T18:18:05Z) - Intrinsic dimension estimation for discrete metrics [65.5438227932088]
In this letter we introduce an algorithm to infer the intrinsic dimension (ID) of datasets embedded in discrete spaces.
We demonstrate its accuracy on benchmark datasets, and we apply it to analyze a metagenomic dataset for species fingerprinting.
This suggests that evolutionary pressure acts on a low-dimensional manifold despite the high dimensionality of sequence space.
arXiv Detail & Related papers (2022-07-20T06:38:36Z) - Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z) - High-Dimensional Sparse Bayesian Learning without Covariance Matrices [66.60078365202867]
We introduce a new inference scheme that avoids explicit construction of the covariance matrix.
Our approach couples a little-known diagonal estimation result from numerical linear algebra with the conjugate gradient algorithm.
On several simulations, our method scales better than existing approaches in computation time and memory.
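The two ingredients named above can be sketched concretely: a stochastic Rademacher-probe diagonal estimator (in the spirit of the Bekas et al. result) combined with conjugate-gradient solves, so that only matrix-vector products with A are ever needed and A^{-1} is never formed. The matrix below is a small synthetic stand-in purely to check the estimator against the exact answer; the paper applies this kind of machinery inside sparse Bayesian learning.

```python
import numpy as np

def cg_solve(matvec, b, tol=1e-10, max_iter=500):
    """Conjugate gradient for an SPD system, using only matrix-vector products."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def stochastic_inverse_diagonal(matvec, n, n_probes, seed=0):
    """Estimate diag(A^{-1}) via Rademacher probes: E[v * (A^{-1} v)] = diag(A^{-1})."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        acc += v * cg_solve(matvec, v)   # elementwise product with the CG solve
    return acc / n_probes

# Toy check on a small SPD matrix where the exact diagonal is available.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 30))
A = M @ M.T + 30.0 * np.eye(30)          # well-conditioned SPD matrix
est = stochastic_inverse_diagonal(lambda x: A @ x, 30, n_probes=1000)
exact = np.diag(np.linalg.inv(A))
```

Because the probe loop touches A only through `matvec`, memory stays O(n) even when A is available only implicitly, which is the scaling advantage the abstract claims.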
arXiv Detail & Related papers (2022-02-25T16:35:26Z) - Unfolding Projection-free SDP Relaxation of Binary Graph Classifier via GDPA Linearization [59.87663954467815]
Algorithm unfolding creates an interpretable and parsimonious neural network architecture by implementing each iteration of a model-based algorithm as a neural layer.
In this paper, leveraging a recent linear algebraic theorem called Gershgorin disc perfect alignment (GDPA), we unroll a projection-free algorithm for the semi-definite programming relaxation (SDR) of a binary graph classifier.
Experimental results show that our unrolled network outperformed pure model-based graph classifiers, and achieved comparable performance to pure data-driven networks but using far fewer parameters.
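The unfolding pattern described above, where each iteration of a model-based solver becomes one neural layer with its own learnable parameters, can be illustrated with a much simpler solver than the paper's GDPA-based SDP algorithm. The sketch below unrolls ISTA for sparse coding; this is a stand-in example, and the per-layer thresholds are fixed by hand here, whereas in an unfolded network they would be trained from data.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unrolled_ista(A, y, thresholds):
    """Fixed-depth ISTA: each entry of `thresholds` parameterizes one 'layer'."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for t in thresholds:                 # one loop pass == one unrolled layer
        # Gradient step on 0.5*||y - Ax||^2, then the proximal shrinkage step.
        x = soft_threshold(x + A.T @ (y - A @ x) / L, t / L)
    return x

# Toy sparse recovery: a 3-sparse signal observed through a random 40x100 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.5, -2.0, 1.0]
y = A @ x_true
x_hat = unrolled_ista(A, y, thresholds=[0.05] * 100)
```

The parsimony claim in the summary comes from exactly this structure: the network's depth and parameter count are those of a truncated iterative solver, not of a generic deep model.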
arXiv Detail & Related papers (2021-09-10T07:01:15Z) - A Critique of Self-Expressive Deep Subspace Clustering [23.971512395191308]
Subspace clustering is an unsupervised clustering technique designed to cluster data that is supported on a union of linear subspaces.
We show that there are a number of potential flaws with this approach which have not been adequately addressed in prior work.
arXiv Detail & Related papers (2020-10-08T00:14:59Z) - Ellipsoidal Subspace Support Vector Data Description [98.67884574313292]
We propose a novel method for transforming data into a low-dimensional space optimized for one-class classification.
We provide both linear and non-linear formulations for the proposed method.
The proposed method is observed to converge much faster than the recently proposed Subspace Support Vector Data Description.
arXiv Detail & Related papers (2020-03-20T21:31:03Z) - Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.