Constructing Deep Neural Networks with a Priori Knowledge of Wireless Tasks
- URL: http://arxiv.org/abs/2001.11355v1
- Date: Wed, 29 Jan 2020 08:54:42 GMT
- Title: Constructing Deep Neural Networks with a Priori Knowledge of Wireless Tasks
- Authors: Jia Guo and Chenyang Yang
- Abstract summary: Two kinds of permutation invariant properties that widely exist in wireless tasks can be harnessed to reduce the number of model parameters.
We identify special DNN architectures whose input-output relationships satisfy these properties, called permutation invariant DNNs (PINNs).
We take predictive resource allocation and interference coordination as examples to show how the PINNs can be employed for learning the optimal policy with unsupervised and supervised learning.
- Score: 37.060397377445504
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks (DNNs) have been employed for designing wireless systems
in many aspects, such as transceiver design, resource optimization, and information
prediction. Existing works either use fully-connected DNNs or DNNs with
particular architectures developed in other domains. Since generating labels
for supervised learning and gathering training samples are time-consuming or
cost-prohibitive, how to develop DNNs with wireless priors for reducing
training complexity remains open. In this paper, we show that two kinds of
permutation invariant properties that widely exist in wireless tasks can be
harnessed to reduce the number of model parameters and hence the sample and
computational complexity for training. We identify special DNN architectures
whose input-output relationships satisfy these properties, called permutation
invariant DNNs (PINNs), and augment the data with the properties. By learning the
impact of the scale of a wireless system, the size of the constructed PINNs can
flexibly adapt to the input data dimension. We take predictive resource
allocation and interference coordination as examples to show how the PINNs can
be employed for learning the optimal policy with unsupervised and supervised
learning. Simulation results demonstrate a dramatic gain of the proposed PINNs
in terms of reducing training complexity.
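The parameter-sharing idea behind permutation invariant architectures can be illustrated with a minimal sketch. The layer below is a generic permutation-equivariant construction, not the exact PINN from the paper: every interchangeable object (e.g. a user) is processed with the same weight matrix `W_self`, and interactions with all other objects share a single matrix `W_other`, so the parameter count is independent of the number of objects K.

```python
import numpy as np

def equivariant_layer(X, W_self, W_other, b):
    """One permutation-equivariant layer (a hypothetical sketch of the
    parameter-sharing idea, not the paper's exact construction).

    X: (K, d) features of K interchangeable objects.
    Parameters W_self, W_other, b are shared across objects, so model
    size does not grow with K.
    """
    # For each object i, aggregate the features of all other objects.
    agg = X.sum(axis=0, keepdims=True) - X
    # Shared affine map followed by ReLU.
    return np.maximum(X @ W_self + agg @ W_other + b, 0.0)

rng = np.random.default_rng(0)
K, d = 4, 3
X = rng.standard_normal((K, d))
W1 = rng.standard_normal((d, d))
W2 = rng.standard_normal((d, d))
b = rng.standard_normal(d)

Y = equivariant_layer(X, W1, W2, b)
perm = rng.permutation(K)
Y_perm = equivariant_layer(X[perm], W1, W2, b)
# Permuting the input objects permutes the outputs the same way:
assert np.allclose(Y[perm], Y_perm)
```

Because the layer's output permutes with its input, stacking such layers (and ending with a permutation-invariant reduction such as a sum) yields a network that satisfies the invariance property by construction, with 2·d² + d parameters per layer instead of (K·d)² + K·d for a fully-connected one.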
Related papers
- Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
In neuromorphic computing, spiking neural networks (SNNs) perform inference tasks, offering significant efficiency gains for workloads involving sequential data.
Recent advances in hardware and software have demonstrated that embedding a few bits of payload in each spike exchanged between the spiking neurons can further enhance inference accuracy.
This paper investigates a wireless neuromorphic split computing architecture employing multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z)
- From AlexNet to Transformers: Measuring the Non-linearity of Deep Neural Networks with Affine Optimal Transport [32.39176908225668]
We introduce the concept of the non-linearity signature of a DNN, the first theoretically sound solution for measuring the non-linearity of deep neural networks.
We provide extensive experimental results that highlight the practical usefulness of the proposed non-linearity signature.
arXiv Detail & Related papers (2023-10-17T17:50:22Z)
- AutoPINN: When AutoML Meets Physics-Informed Neural Networks [30.798918516407376]
PINNs enable the estimation of critical parameters, which are unobservable via physical tools, through observable variables.
Existing PINNs are often manually designed, which is time-consuming and may lead to suboptimal performance.
We propose a framework that enables the automated design of PINNs by combining AutoML and PINNs.
arXiv Detail & Related papers (2022-12-08T03:44:08Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Separable PINN: Mitigating the Curse of Dimensionality in Physics-Informed Neural Networks [6.439575695132489]
Physics-informed neural networks (PINNs) have emerged as new data-driven PDE solvers for both forward and inverse problems.
We demonstrate that the computations in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINN.
We propose a network architecture, called separable PINN (SPINN), which can facilitate forward-mode AD for more efficient computation.
arXiv Detail & Related papers (2022-11-16T08:46:52Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs is on par with that of the current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z)
- Learning Power Control for Cellular Systems with Heterogeneous Graph Neural Network [37.060397377445504]
We show that the power control policy has a combination of different permutation invariance (PI) and permutation equivariance (PE) properties, and that the existing HetGNN does not satisfy these properties.
We design a parameter sharing scheme for HetGNN such that the learned relationship satisfies the desired properties.
arXiv Detail & Related papers (2020-11-06T02:41:38Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Self-Organized Operational Neural Networks with Generative Neurons [87.32169414230822]
Operational neural networks (ONNs) are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators.
We propose Self-organized ONNs (Self-ONNs) with generative neurons that have the ability to adapt (optimize) the nodal operator of each connection.
arXiv Detail & Related papers (2020-04-24T14:37:56Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.