Multi-Point Integrated Sensing and Communication: Fusion Model and
Functionality Selection
- URL: http://arxiv.org/abs/2208.07592v1
- Date: Tue, 16 Aug 2022 08:09:54 GMT
- Title: Multi-Point Integrated Sensing and Communication: Fusion Model and
Functionality Selection
- Authors: Guoliang Li, Shuai Wang, Kejiang Ye, Miaowen Wen, Derrick Wing Kwan
Ng, Marco Di Renzo
- Abstract summary: This paper presents a multi-point ISAC (MPISAC) system that fuses the outputs from multiple ISAC devices for achieving higher sensing performance.
We adopt a fusion model that predicts the fusion accuracy via hypothesis testing and optimal voting analysis.
- Score: 99.67715229413986
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Integrated sensing and communication (ISAC) represents a paradigm shift,
where previously competing wireless transmissions are jointly designed to
operate in harmony via the shared use of the hardware platform for improving
the spectral, energy, and hardware efficiencies. However, due to adverse
factors such as fading and blockage, ISAC without fusion may suffer from high
sensing uncertainty. This paper presents a multi-point ISAC (MPISAC) system
that fuses the outputs from multiple ISAC devices for achieving higher sensing
performance by exploiting multi-radar data redundancy. Furthermore, we propose
to effectively explore the performance trade-off between sensing and
communication via a functionality selection module that adaptively determines
the working state (i.e., sensing or communication) of an ISAC device. The crux
of our approach is to adopt a fusion model that predicts the fusion accuracy
via hypothesis testing and optimal voting analysis. Simulation results
demonstrate the superiority of MPISAC over various benchmark schemes and show
that the proposed approach can effectively span the trade-off region in ISAC
systems.
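To make the voting-based fusion concrete, below is a minimal Python sketch of how the accuracy of a majority-vote fusion rule can be predicted from per-device detection probabilities, together with a toy functionality-selection rule built on that prediction. This is an illustrative simplification, not the paper's exact hypothesis-testing and optimal-voting formulation; the device probabilities, the brute-force enumeration, and the selection heuristic are assumptions introduced here for exposition.

```python
# Minimal illustrative sketch (not the paper's exact formulation): predict the
# accuracy of majority-voting fusion over K independent ISAC sensing devices,
# where device k makes a correct binary detection with probability p[k], and
# use that prediction in a toy functionality-selection rule. All numbers and
# the selection heuristic below are assumptions for illustration.
from itertools import product


def majority_vote_accuracy(p):
    """Probability that a strict majority of K independent detectors is correct."""
    K = len(p)
    acc = 0.0
    for outcome in product([0, 1], repeat=K):   # 1 = device decides correctly
        prob = 1.0
        for k, bit in enumerate(outcome):
            prob *= p[k] if bit else (1.0 - p[k])
        if sum(outcome) > K / 2:                 # fused decision is correct
            acc += prob
    return acc


def select_sensing_devices(p_all, num_sensing):
    """Toy functionality selection: assign the num_sensing most reliable
    devices to the sensing state and leave the rest for communication."""
    order = sorted(range(len(p_all)), key=lambda k: p_all[k], reverse=True)
    sensing = order[:num_sensing]
    return sensing, majority_vote_accuracy([p_all[k] for k in sensing])


if __name__ == "__main__":
    p_all = [0.90, 0.85, 0.70, 0.60, 0.55]       # hypothetical per-device accuracies
    for m in (1, 3, 5):                          # odd committee sizes avoid ties
        devices, acc = select_sensing_devices(p_all, m)
        print(f"sensing devices {devices}: predicted fusion accuracy {acc:.4f}")
```

Sweeping the number of sensing devices in this way traces a crude sensing-versus-communication trade-off: assigning more devices to sensing raises the predicted fusion accuracy but leaves fewer devices available for communication.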
Related papers
- Multimodal Multi-loss Fusion Network for Sentiment Analysis [3.8611070161950902]
This paper investigates the optimal selection and fusion of feature encoders across multiple modalities to improve sentiment detection.
We compare different fusion methods and examine the impact of multi-loss training within the multi-modality fusion network.
We have found that integrating context significantly enhances model performance.
arXiv Detail & Related papers (2023-08-01T03:54:27Z) - Integrated Sensing, Computation, and Communication for UAV-assisted
Federated Edge Learning [52.7230652428711]
Federated edge learning (FEEL) enables privacy-preserving model training through periodic communication between edge devices and the server.
Unmanned Aerial Vehicle (UAV)-mounted edge devices are particularly advantageous for FEEL because their flexibility and mobility enable efficient data collection.
arXiv Detail & Related papers (2023-06-05T16:01:33Z) - Task-Oriented Sensing, Computation, and Communication Integration for
Multi-Device Edge AI [108.08079323459822]
This paper studies a new multi-device edge artificial intelligence (AI) system, which jointly exploits AI-model split inference and integrated sensing and communication (ISAC).
We measure the inference accuracy by adopting an approximate but tractable metric, namely discriminant gain.
arXiv Detail & Related papers (2022-07-03T06:57:07Z) - Transformer-based Network for RGB-D Saliency Detection [82.6665619584628]
Key to RGB-D saliency detection is to fully mine and fuse information at multiple scales across the two modalities.
We show that the transformer is a uniform operation that is highly effective for both feature fusion and feature enhancement.
Our proposed network performs favorably against state-of-the-art RGB-D saliency detection methods.
arXiv Detail & Related papers (2021-12-01T15:53:58Z) - Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal
Sentiment Analysis [96.46952672172021]
The Bi-Bimodal Fusion Network (BBFN) is a novel end-to-end network that performs fusion on pairwise modality representations.
The model takes two bimodal pairs as input due to the known information imbalance among modalities.
arXiv Detail & Related papers (2021-07-28T23:33:42Z) - Accelerating Edge Intelligence via Integrated Sensing and Communication [37.94664609065957]
This paper proposes to accelerate edge intelligence via integrated sensing and communication (ISAC).
As such, the sensing and communication stages are merged so as to make the best use of the wireless signals for the dual purpose of dataset generation and uploading.
The globally optimal solution is derived via a semidefinite relaxation with a guaranteed rank-1 solution, and a performance analysis is conducted to quantify the ISAC gain.
arXiv Detail & Related papers (2021-07-20T15:42:06Z) - Modular Multi Target Tracking Using LSTM Networks [0.0]
This paper proposes a model-free, end-to-end approach for an airborne target tracking system using sensor measurements.
The proposed modular blocks can be independently trained and used in multitude of tracking applications.
arXiv Detail & Related papers (2020-11-16T15:58:49Z) - Mutual Information for Explainable Deep Learning of Multiscale Systems [1.1470070927586016]
We develop a model-agnostic, moment-independent global sensitivity analysis (GSA) that relies on differential mutual information to rank the effects of CVs on QoIs.
We demonstrate that the surrogate-driven mutual information GSA provides useful and distinguishable rankings on two applications of interest in energy storage.
arXiv Detail & Related papers (2020-09-07T18:26:21Z) - RGB-D Salient Object Detection with Cross-Modality Modulation and
Selection [126.4462739820643]
We present an effective method to progressively integrate and refine the cross-modality complementarities for RGB-D salient object detection (SOD).
The proposed network mainly addresses two challenging issues: 1) how to effectively integrate the complementary information from the RGB image and its corresponding depth map, and 2) how to adaptively select more saliency-related features.
arXiv Detail & Related papers (2020-07-14T14:22:50Z)