Neural Functions for Learning Periodic Signal
- URL: http://arxiv.org/abs/2506.09526v1
- Date: Wed, 11 Jun 2025 08:52:01 GMT
- Title: Neural Functions for Learning Periodic Signal
- Authors: Woojin Cho, Minju Jo, Kookjin Lee, Noseong Park
- Abstract summary: We propose a novel network architecture, which extracts periodic patterns from measurements and leverages this information to represent the signal. We demonstrate the efficacy of the proposed method through comprehensive experiments, including the learning of the periodic solutions for differential equations.
- Score: 25.278349611091983
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: As function approximators, deep neural networks have served as an effective tool for representing various signal types. Recent approaches use multi-layer perceptrons (MLPs) to learn a nonlinear mapping from a coordinate to its corresponding signal value, facilitating the learning of continuous neural representations from discrete data points. Despite notable successes in learning diverse signal types, coordinate-based MLPs often suffer from overfitting and limited generalizability beyond the training region, resulting in subpar extrapolation performance. This study addresses scenarios where the underlying true signals exhibit periodic properties, either spatially or temporally. We propose a novel network architecture that extracts periodic patterns from measurements and leverages this information to represent the signal, thereby enhancing generalization and improving extrapolation performance. We demonstrate the efficacy of the proposed method through comprehensive experiments, including learning periodic solutions of differential equations, and time series imputation (interpolation) and forecasting (extrapolation) on real-world datasets.
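The core idea in the abstract — extract a periodic pattern from measurements, then build it into the representation so the model extrapolates beyond the training region — can be illustrated with a minimal sketch. This is not the paper's architecture: the FFT-based frequency estimate and the linear least-squares fit below are illustrative stand-ins for the learned components, and all names are hypothetical.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's method): estimate a dominant
# frequency from the measurements, then represent the signal with periodic
# coordinate features so the fit extrapolates beyond the training region.

period = 2.0
dx = 0.02
x_train = np.arange(200) * dx                    # training region: exactly two periods
y_train = np.sin(2 * np.pi * x_train / period)   # underlying periodic signal

# Extract the dominant frequency from the measurements via the FFT,
# mimicking the idea of extracting periodic patterns before fitting.
spectrum = np.abs(np.fft.rfft(y_train))
freqs = np.fft.rfftfreq(len(x_train), d=dx)
omega = 2 * np.pi * freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

def periodic_features(x):
    # Periodic coordinate embedding: gamma(x) = [sin(omega x), cos(omega x), 1]
    return np.stack([np.sin(omega * x), np.cos(omega * x), np.ones_like(x)], axis=-1)

w, *_ = np.linalg.lstsq(periodic_features(x_train), y_train, rcond=None)

# Evaluate far outside the training region [0, 4): the periodic features
# carry the structure, so the extrapolation stays accurate.
x_test = np.linspace(8.0, 10.0, 100)
y_pred = periodic_features(x_test) @ w
max_err = np.max(np.abs(y_pred - np.sin(2 * np.pi * x_test / period)))
print(f"max extrapolation error: {max_err:.2e}")
```

A plain coordinate-based fit (e.g. polynomial features in x) would match the training region but diverge on [8, 10]; baking the estimated period into the features is what makes the extrapolation work in this toy setting.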
Related papers
- Machine Learning in Short-Reach Optical Systems: A Comprehensive Survey [2.425630641479336]
This paper outlines the application of machine learning techniques in short-reach communications.
We introduce a novel taxonomy for time-series methods employed in machine learning signal processing.
We aim to pave the way for more practical and efficient deployment of machine learning approaches in short-reach optical communication systems.
arXiv Detail & Related papers (2024-05-02T16:04:30Z) - ResFields: Residual Neural Fields for Spatiotemporal Signals [61.44420761752655]
ResFields is a novel class of networks specifically designed to effectively represent complex temporal signals.
We conduct comprehensive analysis of the properties of ResFields and propose a matrix factorization technique to reduce the number of trainable parameters.
We demonstrate the practical utility of ResFields by showcasing its effectiveness in capturing dynamic 3D scenes from sparse RGBD cameras.
arXiv Detail & Related papers (2023-09-06T16:59:36Z) - FFEINR: Flow Feature-Enhanced Implicit Neural Representation for Spatio-temporal Super-Resolution [4.577685231084759]
This paper proposes a Feature-Enhanced Neural Implicit Representation (FFEINR) for super-resolution of flow field data.
It can take full advantage of the implicit neural representation in terms of model structure and sampling resolution.
The training process of FFEINR is facilitated by introducing feature enhancements for the input layer.
arXiv Detail & Related papers (2023-08-24T02:28:18Z) - RRCNN: A novel signal decomposition approach based on recurrent residue convolutional neural network [7.5123109191537205]
We propose a new non-stationary signal decomposition method under the framework of deep learning.
We use the convolutional neural network, residual structure and nonlinear activation function to compute in an innovative way the local average of the signal.
In the experiments, we evaluate the performance of the proposed model from two points of view: the calculation of the local average and the signal decomposition.
arXiv Detail & Related papers (2023-07-04T13:53:01Z) - Understanding of the properties of neural network approaches for transient light curve approximations [37.91290708320157]
This paper presents a search for the best-performing methods to approximate the observed light curves over time and wavelength.
Test datasets include simulated PLAsTiCC and real Zwicky Transient Facility Bright Transient Survey light curves of transients.
arXiv Detail & Related papers (2022-09-15T18:00:08Z) - Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Meta-Learning Sparse Implicit Neural Representations [69.15490627853629]
Implicit neural representations are a promising new avenue of representing general signals.
Current approaches are difficult to scale to a large number of signals or to a large dataset.
We show that meta-learned sparse neural representations achieve a much smaller loss than dense meta-learned models.
arXiv Detail & Related papers (2021-10-27T18:02:53Z) - Signal Transformer: Complex-valued Attention and Meta-Learning for
Signal Recognition [33.178794056273304]
We propose a Complex-valued Attentional MEta Learner (CAMEL) for general few-shot signal recognition problems, with theoretical convergence guarantees.
Signal recognition experiments demonstrate the superiority of the proposed method when only small amounts of data are available.
arXiv Detail & Related papers (2021-06-05T03:57:41Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Meta-Learning for Koopman Spectral Analysis with Short Time-series [49.41640137945938]
Existing methods require long time-series for training neural networks.
We propose a meta-learning method for estimating embedding functions from unseen short time-series.
We experimentally demonstrate that the proposed method achieves better performance in terms of eigenvalue estimation and future prediction.
arXiv Detail & Related papers (2021-02-09T07:19:19Z) - Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z) - Multi-Scale Neural network for EEG Representation Learning in BCI [2.105172041656126]
We propose a novel deep multi-scale neural network that discovers feature representations in multiple frequency/time ranges.
By representing EEG signals with spectral-temporal information, the proposed method can be utilized for diverse paradigms.
arXiv Detail & Related papers (2020-03-02T04:06:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.