Deep physical neural networks enabled by a backpropagation algorithm for
arbitrary physical systems
- URL: http://arxiv.org/abs/2104.13386v1
- Date: Tue, 27 Apr 2021 18:00:02 GMT
- Title: Deep physical neural networks enabled by a backpropagation algorithm for
arbitrary physical systems
- Authors: Logan G. Wright, Tatsuhiro Onodera, Martin M. Stein, Tianyu Wang,
Darren T. Schachter, Zoey Hu, Peter L. McMahon
- Abstract summary: We propose a radical alternative for implementing deep neural network models: Physical Neural Networks.
We introduce a hybrid physical-digital algorithm called Physics-Aware Training to efficiently train sequences of controllable physical systems to act as deep neural networks.
- Score: 3.7785805908699803
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep neural networks have become a pervasive tool in science and engineering.
However, modern deep neural networks' growing energy requirements now
increasingly limit their scaling and broader use. We propose a radical
alternative for implementing deep neural network models: Physical Neural
Networks. We introduce a hybrid physical-digital algorithm called Physics-Aware
Training to efficiently train sequences of controllable physical systems to act
as deep neural networks. This method automatically trains the functionality of
any sequence of real physical systems, directly, using backpropagation, the
same technique used for modern deep neural networks. To illustrate their
generality, we demonstrate physical neural networks with three diverse physical
systems: optical, mechanical, and electrical. Physical neural networks may
facilitate unconventional machine learning hardware that is orders of magnitude
faster and more energy efficient than conventional electronic processors.
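The core of Physics-Aware Training is a hybrid loop: the forward pass is executed by the real physical system, while gradients for the backward pass are estimated by a differentiable digital model of that system. A minimal PyTorch-style sketch is below; `physical_forward` and `digital_model` are hypothetical stand-ins (here both are a tanh toy), whereas in the actual method the digital model is a simulation trained to match the physics.

```python
import torch

# Hypothetical stand-ins: in the real method, `physical_forward` would query
# the physical system (e.g. an optical setup) and `digital_model` would be a
# differentiable simulation trained to match it. Here both are a tanh toy.
def physical_forward(x, theta):
    return torch.tanh(x @ theta)

def digital_model(x, theta):
    return torch.tanh(x @ theta)

class PhysicsAwareLayer(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, theta):
        ctx.save_for_backward(x, theta)
        return physical_forward(x, theta)  # exact, physically executed forward pass

    @staticmethod
    def backward(ctx, grad_out):
        x, theta = ctx.saved_tensors
        # Gradients come from the digital surrogate, not the physics itself
        with torch.enable_grad():
            x_ = x.detach().requires_grad_()
            th_ = theta.detach().requires_grad_()
            y = digital_model(x_, th_)
            return torch.autograd.grad(y, (x_, th_), grad_outputs=grad_out)

x = torch.randn(4, 8)
theta = torch.randn(8, 3, requires_grad=True)
loss = PhysicsAwareLayer.apply(x, theta).pow(2).sum()
loss.backward()  # theta.grad now holds surrogate-model gradients
```

Because the physical forward pass is exact, simulation-reality mismatch accumulates only in the gradient estimates rather than in the outputs, which is what lets training remain stable on imperfect hardware.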
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
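For orientation, the backpropagation-free idea behind forward-forward-style learning can be sketched as a purely local update: each layer raises a "goodness" score for positive data and lowers it for negative data, with no error signal passed between layers. The numpy toy below is illustrative only (assumed threshold and learning rate) and is not the authors' CSDP rule.

```python
import numpy as np

# Toy forward-forward-style local update: goodness = sum of squared ReLU
# activations; weights move to raise goodness on positive data and lower it
# on negative data. Illustrative only; this is not the paper's CSDP rule.
def local_step(W, x, positive, lr=0.01, theta=2.0):
    h = np.maximum(x @ W, 0.0)             # layer activity
    sign = 1.0 if positive else -1.0
    W += sign * lr * np.outer(x, 2.0 * h)  # local gradient of goodness, no backprop
    return (h ** 2).sum() > theta          # classify by thresholded goodness

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(16, 8))
local_step(W, rng.normal(size=16), positive=True)
```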
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
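As a concrete (and simplified, assumed) picture of "neural networks as computational graphs of parameters": neurons become nodes, weights become edge features, and biases become node features. The encoding below is an illustration, not the paper's exact construction.

```python
import numpy as np

# Encode an MLP as a graph: one node per neuron, one edge per weight.
# Node features hold biases (zero for inputs); edge features hold weights.
def mlp_to_graph(weights, biases):
    sizes = [weights[0].shape[0]] + [W.shape[1] for W in weights]
    offsets = np.cumsum([0] + sizes)  # node index ranges per layer
    node_feats = np.concatenate([np.zeros(sizes[0])] + list(biases))
    edges, edge_feats = [], []
    for l, W in enumerate(weights):
        for i in range(W.shape[0]):
            for j in range(W.shape[1]):
                edges.append((offsets[l] + i, offsets[l + 1] + j))
                edge_feats.append(W[i, j])
    return node_feats, np.array(edges), np.array(edge_feats)

# Example: a 2-3-1 MLP becomes a graph with 6 nodes and 9 edges.
Ws = [np.ones((2, 3)), np.ones((3, 1))]
bs = [np.zeros(3), np.zeros(1)]
nodes, E, w = mlp_to_graph(Ws, bs)
```

A graph neural network operating on this representation can then process networks of differing widths and depths with a single model, which is what enables the architecture-diverse encoding the summary describes.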
- Design and development of opto-neural processors for simulation of neural networks trained in image detection for potential implementation in hybrid robotics [0.0]
Living neural networks offer advantages of lower power consumption, faster processing, and biological realism.
This work proposes a simulated living neural network trained indirectly by backpropagating STDP based algorithms using precision activation by optogenetics.
arXiv Detail & Related papers (2024-01-17T04:42:49Z)
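For reference, the standard pair-based STDP rule that this line of work builds on weights synaptic change by the exponential of the spike-time difference; the time constants and learning rates below are illustrative, and the paper's backpropagated, optogenetically driven variant differs in how the rule is applied.

```python
import numpy as np

# Standard pair-based STDP: potentiate when the presynaptic spike precedes
# the postsynaptic one, depress otherwise, with exponential time windows.
# Constants are illustrative, not taken from the paper.
def stdp(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    dt = t_post - t_pre              # spike-time difference in ms
    if dt > 0:                       # pre before post: potentiate
        w += a_plus * np.exp(-dt / tau)
    else:                            # post before (or with) pre: depress
        w -= a_minus * np.exp(dt / tau)
    return np.clip(w, 0.0, 1.0)      # keep the weight in a bounded range
```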
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Physics-Guided, Physics-Informed, and Physics-Encoded Neural Networks in Scientific Computing [0.0]
Recent breakthroughs in computing power have made it feasible to use machine learning and deep learning to advance scientific computing.
Due to their intrinsic architecture, conventional neural networks cannot be successfully trained and scoped when data is sparse.
Neural networks offer a strong foundation to digest physical-driven or knowledge-based constraints.
arXiv Detail & Related papers (2022-11-14T15:44:07Z)
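A physics-informed network in this sense augments the data loss with a differential-equation residual, so training can succeed even when data is sparse. Below is a minimal sketch for the toy ODE u' + u = 0 with u(0) = 1 (an assumed example, not from the paper):

```python
import torch

# Small network u(t) and collocation points on [0, 2]; both are assumptions.
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
t = torch.linspace(0, 2, 50).reshape(-1, 1).requires_grad_()

def pinn_loss():
    u = net(t)
    # du/dt via autodiff; create_graph so the loss stays differentiable
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    residual = du + u                      # physics constraint u' + u = 0
    ic = net(torch.zeros(1, 1)) - 1.0      # initial condition u(0) = 1
    return residual.pow(2).mean() + ic.pow(2).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(1000):
    opt.zero_grad()
    loss = pinn_loss()
    loss.backward()
    opt.step()
```

No labeled solution data appears in the loss at all here; the physics residual alone constrains the fit, which is exactly why the approach tolerates sparse data.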
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
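The sparsity comes from communicating with binary spikes rather than dense activations. A toy leaky integrate-and-fire (LIF) layer with a firing-rate readout for regression is sketched below; the membrane constants and the decoding scheme are assumptions, not the paper's framework.

```python
import numpy as np

# Toy LIF layer: leaky membrane integration, threshold-and-reset spiking,
# and a spike-count (rate) readout as the regression output.
def lif_forward(x, W, steps=100, tau=0.9, v_th=1.0):
    v = np.zeros(W.shape[1])
    spikes = np.zeros(W.shape[1])
    for _ in range(steps):
        v = tau * v + x @ W       # leaky integration of input current
        fired = v >= v_th
        spikes += fired
        v[fired] = 0.0            # reset membrane after a spike
    return spikes / steps         # firing rate as the regression output

rng = np.random.default_rng(0)
rate = lif_forward(rng.normal(size=16), rng.normal(scale=0.1, size=(16, 4)))
```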
- Deep Spiking Convolutional Neural Network for Single Object Localization Based On Deep Continuous Local Learning [0.0]
We propose a deep convolutional spiking neural network for the localization of a single object in a grayscale image.
Results reported on Oxford-IIIT-Pet validate the exploitation of spiking neural networks with a supervised learning approach.
arXiv Detail & Related papers (2021-05-12T12:02:05Z)
- Explainable artificial intelligence for mechanics: physics-informing neural networks for constitutive models [0.0]
In mechanics, the new and active field of physics-informed neural networks attempts to mitigate this disadvantage by designing deep neural networks on the basis of mechanical knowledge.
We propose a first step towards a physics-informing approach, which explains neural networks trained on mechanical data a posteriori.
Therein, the principal component analysis decorrelates the distributed representations in cell states of RNNs and allows the comparison to known and fundamental functions.
arXiv Detail & Related papers (2021-04-20T18:38:52Z)
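The analysis step in question, decorrelating RNN cell states with PCA so they can be compared with known functions, can be sketched directly; the data shapes below are assumptions.

```python
import numpy as np

# PCA on cell states collected over time: center, take the SVD, and project
# onto the principal directions to obtain decorrelated trajectories.
def pca_cell_states(states):              # states: (timesteps, hidden_dim)
    centered = states - states.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = s ** 2 / (s ** 2).sum()   # variance ratio per component
    return centered @ vt.T, explained

states = np.random.default_rng(0).normal(size=(200, 64))
proj, ratio = pca_cell_states(states)
```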
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
- A$^3$: Accelerating Attention Mechanisms in Neural Networks with Approximation [3.5217810503607896]
We design and architect A$^3$, which accelerates attention mechanisms in neural networks with algorithmic approximation and hardware specialization.
Our proposed accelerator achieves multiple orders of magnitude improvement in energy efficiency (performance/watt) as well as substantial speedup over the state-of-the-art conventional hardware.
arXiv Detail & Related papers (2020-02-22T02:09:21Z)
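The algorithmic half of the idea, approximating attention by evaluating only the strongest query-key matches, can be sketched in software; the top-k selection below is an assumed stand-in for the hardware's candidate-search mechanism, and k is an illustrative choice.

```python
import numpy as np

# Approximate attention: for each query, keep only the k largest scores and
# take the softmax over that subset, skipping the rest of the keys.
def topk_attention(Q, K, V, k=8):
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    out = np.zeros((Q.shape[0], V.shape[1]))
    for i, row in enumerate(scores):
        idx = np.argpartition(row, -k)[-k:]  # indices of the k largest scores
        w = np.exp(row[idx] - row[idx].max())
        out[i] = (w / w.sum()) @ V[idx]      # softmax over kept entries only
    return out

rng = np.random.default_rng(0)
out = topk_attention(rng.normal(size=(5, 32)), rng.normal(size=(64, 32)),
                     rng.normal(size=(64, 16)), k=8)
```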
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.