Model-Driven Beamforming Neural Networks
- URL: http://arxiv.org/abs/2001.05277v1
- Date: Wed, 15 Jan 2020 12:50:09 GMT
- Title: Model-Driven Beamforming Neural Networks
- Authors: Wenchao Xia, Gan Zheng, Kai-Kit Wong, and Hongbo Zhu
- Abstract summary: This article introduces general data- and model-driven beamforming neural networks (BNNs).
It presents various possible learning strategies and discusses complexity reduction for the DL-based BNNs.
We also offer enhancement methods such as training-set augmentation and transfer learning to improve the generality of BNNs.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Beamforming is evidently a core technology in recent generations of mobile
communication networks. Nevertheless, an iterative process is typically
required to optimize the parameters, making it ill-suited for real-time
implementation due to its high complexity and computational delay. Heuristic
solutions such as zero-forcing (ZF) are simpler but at the expense of
performance loss. Alternatively, deep learning (DL) is well understood to be a
generalizing technique that can deliver promising results for a wide range of
applications at much lower complexity if it is sufficiently trained. As a
consequence, DL may present itself as an attractive solution to beamforming. To
exploit DL, this article introduces general data- and model-driven beamforming
neural networks (BNNs), presents various possible learning strategies, and also
discusses complexity reduction for the DL-based BNNs. We also offer enhancement
methods such as training-set augmentation and transfer learning in order to
improve the generality of BNNs, accompanied by computer simulation results and
testbed results showing the performance of such BNN solutions.
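The zero-forcing heuristic contrasted with BNNs above admits a compact sketch. The dimensions, variable names, and power normalization below are illustrative assumptions rather than the paper's exact setup; the sketch only shows the defining property of ZF, namely that each user's beam nulls interference at every other user.

```python
import numpy as np

def zf_beamformer(H):
    """Zero-forcing beamformer for a K-user x N-antenna channel H.

    W is the right pseudo-inverse of H, so H @ W is diagonal:
    inter-user interference is forced to zero.
    """
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
    # Normalize each user's beam to unit transmit power.
    return W / np.linalg.norm(W, axis=0, keepdims=True)

rng = np.random.default_rng(0)
K, N = 4, 8  # 4 single-antenna users, 8 transmit antennas (assumed)
H = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
W = zf_beamformer(H)
G = H @ W                              # effective channel after beamforming
leakage = G - np.diag(np.diag(G))      # residual inter-user interference
print(np.max(np.abs(leakage)))         # numerically ~0
```

The performance ZF gives up relative to iteratively optimized designs is what motivates learning the mapping from the channel H to beamforming vectors with a BNN instead.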
Related papers
- A Multi-Head Ensemble Multi-Task Learning Approach for Dynamical
Computation Offloading [62.34538208323411]
We propose a multi-head ensemble multi-task learning (MEMTL) approach with a shared backbone and multiple prediction heads (PHs).
MEMTL outperforms benchmark methods in both inference accuracy and mean square error without requiring additional training data.
arXiv Detail & Related papers (2023-09-02T11:01:16Z)
- Solving Large-scale Spatial Problems with Convolutional Neural Networks [88.31876586547848]
We employ transfer learning to improve training efficiency for large-scale spatial problems.
We propose that a convolutional neural network (CNN) can be trained on small windows of signals, but evaluated on arbitrarily large signals with little to no performance degradation.
arXiv Detail & Related papers (2023-06-14T01:24:42Z)
- Decouple Graph Neural Networks: Train Multiple Simple GNNs Simultaneously Instead of One [60.5818387068983]
Graph neural networks (GNNs) suffer from severe inefficiency.
We propose to decouple a multi-layer GNN as multiple simple modules for more efficient training.
We show that the proposed framework is highly efficient with reasonable performance.
arXiv Detail & Related papers (2023-04-20T07:21:32Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Recurrent Bilinear Optimization for Binary Neural Networks [58.972212365275595]
Binary neural networks (BNNs) neglect the intrinsic bilinear relationship between real-valued weights and scale factors.
Our work is the first attempt to optimize BNNs from the bilinear perspective.
We obtain robust RBONNs, which show impressive performance over state-of-the-art BNNs on various models and datasets.
arXiv Detail & Related papers (2022-09-04T06:45:33Z)
- TxSim: Modeling Training of Deep Neural Networks on Resistive Crossbar Systems [3.1887081453726136]
Crossbar-based computations face a major challenge due to a variety of device- and circuit-level non-idealities.
We propose TxSim, a fast and customizable modeling framework to functionally evaluate DNN training on crossbar-based hardware.
arXiv Detail & Related papers (2020-02-25T19:29:43Z)
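The kind of non-ideality such crossbar frameworks evaluate can be illustrated with a toy model. This is a hedged sketch under an assumed multiplicative conductance-noise model, not TxSim's actual device models: the weights a crossbar stores as conductances deviate from their ideal values, perturbing every matrix-vector product.

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.standard_normal((8, 4))  # ideal weight matrix
x = rng.standard_normal(4)       # input vector

ideal = W @ x
# Assumed device-level variation: 5% multiplicative conductance noise.
noisy_W = W * (1.0 + 0.05 * rng.standard_normal(W.shape))
nonideal = noisy_W @ x

# Relative error of the crossbar-computed product vs. the ideal one.
error = np.linalg.norm(ideal - nonideal) / np.linalg.norm(ideal)
print(error)
```

During training these errors accumulate across layers and iterations, which is why functional evaluation of DNN training on such hardware requires dedicated modeling.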
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.