Dynamic Neural Networks: A Survey
- URL: http://arxiv.org/abs/2102.04906v2
- Date: Wed, 10 Feb 2021 13:45:05 GMT
- Title: Dynamic Neural Networks: A Survey
- Authors: Yizeng Han, Gao Huang, Shiji Song, Le Yang, Honghui Wang, Yulin Wang
- Abstract summary: Dynamic neural networks are an emerging research topic in deep learning.
Dynamic networks can adapt their structures or parameters to different inputs, leading to notable advantages in terms of accuracy, computational efficiency, adaptiveness, etc.
- Score: 34.30356864359789
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic neural networks are an emerging research topic in deep learning.
Compared to static models, which have fixed computational graphs and parameters
at the inference stage, dynamic networks can adapt their structures or
parameters to different inputs, leading to notable advantages in terms of
accuracy, computational efficiency, adaptiveness, etc. In this survey, we
comprehensively review this rapidly developing area by dividing dynamic
networks into three main categories: 1) instance-wise dynamic models that
process each instance with data-dependent architectures or parameters; 2)
spatial-wise dynamic networks that conduct adaptive computation with respect to
different spatial locations of image data; and 3) temporal-wise dynamic models
that perform adaptive inference along the temporal dimension for sequential
data such as videos and texts. The important research problems of dynamic
networks, e.g., architecture design, decision-making schemes, optimization
techniques, and applications, are reviewed systematically. Finally, we discuss
the open problems in this field together with interesting future research
directions.
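As a concrete illustration of the instance-wise category described above, below is a minimal sketch of an early-exit classifier in PyTorch. This is a hypothetical example, not code from the survey or the works it covers: an auxiliary classifier is attached to an intermediate stage, and the remaining layers are skipped at inference whenever that classifier is already confident, so the effective computational graph depends on the input.

```python
# Minimal sketch of an instance-wise dynamic network (early exiting).
# Hypothetical illustration; architecture and threshold are arbitrary choices.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EarlyExitNet(nn.Module):
    def __init__(self, num_classes: int = 10, threshold: float = 0.9):
        super().__init__()
        self.threshold = threshold              # confidence required to exit early
        self.stage1 = nn.Sequential(            # cheap early stage
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
        )
        self.exit1 = nn.Linear(32 * 8 * 8, num_classes)   # auxiliary classifier
        self.stage2 = nn.Sequential(            # more expensive later stage
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.exit2 = nn.Linear(64 * 4 * 4, num_classes)   # final classifier

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h1 = self.stage1(x)
        logits1 = self.exit1(h1.flatten(1))
        if not self.training:                   # adaptive inference only at test time
            conf = F.softmax(logits1, dim=1).max(dim=1).values
            if bool((conf > self.threshold).all()):
                return logits1                  # data-dependent early exit
        h2 = self.stage2(h1)                    # during training, both heads would be supervised
        return self.exit2(h2.flatten(1))

model = EarlyExitNet().eval()
with torch.no_grad():
    out = model(torch.randn(1, 3, 32, 32))      # exits early only if the first head is confident
print(out.shape)  # torch.Size([1, 10])
```

In practice, early-exit models train all classifiers jointly and tune the confidence threshold to trade accuracy against computation; the survey reviews such decision-making and optimization choices in detail.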
Related papers
- Task-Oriented Real-time Visual Inference for IoVT Systems: A Co-design Framework of Neural Networks and Edge Deployment [61.20689382879937]
Task-oriented edge computing shifts data analysis to the edge.
Existing methods struggle to balance high model performance with low resource consumption.
We propose a novel co-design framework to optimize neural network architecture.
arXiv Detail & Related papers (2024-10-29T19:02:54Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Learning Continuous Network Emerging Dynamics from Scarce Observations via Data-Adaptive Stochastic Processes [11.494631894700253]
We introduce ODE Processes for Network Dynamics (NDP4ND), a new class of processes governed by data-adaptive network dynamics.
We show that the proposed method has excellent data and computational efficiency, and can adapt to unseen network emerging dynamics.
arXiv Detail & Related papers (2023-10-25T08:44:05Z)
- The Underlying Correlated Dynamics in Neural Training [6.385006149689549]
Training of neural networks is a computationally intensive task.
We propose a model based on the correlation of the parameters' dynamics, which dramatically reduces the dimensionality.
This representation enhances the understanding of the underlying training dynamics and can pave the way for designing better acceleration techniques.
arXiv Detail & Related papers (2022-12-18T08:34:11Z)
- Gradient-Based Trajectory Optimization With Learned Dynamics [80.41791191022139]
We use machine learning techniques to learn a differentiable dynamics model of the system from data.
We show that a neural network can model highly nonlinear behaviors accurately for large time horizons.
In our hardware experiments, we demonstrate that our learned model can represent complex dynamics for both the Spot robot and a radio-controlled (RC) car; a generic sketch of this idea appears below, after the related-papers list.
arXiv Detail & Related papers (2022-04-09T22:07:34Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Dynamic Network Embedding Survey [11.742863376032112]
We give a survey of dynamic network embedding in this paper.
We present two basic data models for dynamic networks, namely the discrete model and the continuous model.
We build a taxonomy that refines the category hierarchy by typical learning models.
arXiv Detail & Related papers (2021-03-29T09:27:53Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Deep learning of contagion dynamics on complex networks [0.0]
We propose a complementary approach based on deep learning to build effective models of contagion dynamics on networks.
By allowing simulations on arbitrary network structures, our approach makes it possible to explore the properties of the learned dynamics beyond the training data.
Our results demonstrate how deep learning offers a new and complementary perspective to build effective models of contagion dynamics on networks.
arXiv Detail & Related papers (2020-06-09T17:18:34Z)
- Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey [11.18312489268624]
We establish a foundation of dynamic networks with consistent, detailed terminology and notation.
We present a comprehensive survey of dynamic graph neural network models using the proposed terminology.
arXiv Detail & Related papers (2020-05-13T23:56:38Z)
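To make the "Gradient-Based Trajectory Optimization With Learned Dynamics" entry above more concrete, here is a generic, hypothetical sketch in PyTorch: a small neural network stands in for a learned (in practice, trained-from-data) dynamics model, and an action sequence is optimized by backpropagating a rollout cost through it. The network size, cost terms, and horizon are illustrative assumptions, not details taken from that paper.

```python
# Generic sketch of gradient-based trajectory optimization through a learned
# dynamics model; illustrative only, not the cited paper's implementation.
import torch
import torch.nn as nn

state_dim, action_dim, horizon = 4, 2, 20

# Stand-in for a learned dynamics model that predicts the next state from
# (state, action); in practice it would be trained on transitions from the real system.
dynamics = nn.Sequential(
    nn.Linear(state_dim + action_dim, 64), nn.Tanh(),
    nn.Linear(64, state_dim),
)

def rollout_cost(x0: torch.Tensor, actions: torch.Tensor, goal: torch.Tensor) -> torch.Tensor:
    """Roll the learned model forward and accumulate a quadratic tracking cost."""
    x, cost = x0, torch.zeros(())
    for a in actions:                        # actions has shape (horizon, action_dim)
        x = dynamics(torch.cat([x, a]))      # differentiable one-step prediction
        cost = cost + ((x - goal) ** 2).sum() + 0.01 * (a ** 2).sum()
    return cost

x0 = torch.zeros(state_dim)
goal = torch.ones(state_dim)
actions = torch.zeros(horizon, action_dim, requires_grad=True)
opt = torch.optim.Adam([actions], lr=0.05)

# Optimize the action sequence by backpropagating through the learned model.
for step in range(200):
    opt.zero_grad()
    loss = rollout_cost(x0, actions, goal)
    loss.backward()
    opt.step()

print(float(rollout_cost(x0, actions.detach(), goal)))  # cost after optimization
```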
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.